Note that lstm_cell_backward returns a gradient dictionary that contains dc_prev, which makes sense since you are doing backpropagation. When you start backpropagation you have to initialize dc_prev, i.e., create a NumPy array of the right shape containing zeros. This is the variable you start with. I suspect the confusing part is the naming of the parameters of the lstm_cell_backward function.
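As a minimal sketch (the dimension names and sizes here are made up for illustration, not taken from the assignment), initializing the starting cell-state gradient could look like:

```python
import numpy as np

# Hypothetical dimensions: n_a hidden units, m examples in the batch.
n_a, m = 5, 10

# At the last time step there is no "future" cell-state gradient flowing
# back, so the running cell-state gradient starts as all zeros.
dc_prevt = np.zeros((n_a, m))
```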
Think about the value you want to pass as the first parameter to the lstm_cell_backward function. How is this value calculated?
For inspiration you can look at how you implemented the rnn_backward function. The comment above that function states:
“Note that this notebook does not implement the backward path from the Loss ‘J’ backwards to ‘a’. This would have included the dense layer and softmax which are a part of the forward path. This is assumed to be calculated elsewhere and the result passed to rnn_backward in ‘da’. You must combine this with the loss from the previous stages when calling rnn_cell_backward”.
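To make that comment concrete, here is a hedged sketch of the backward loop (shapes and the `caches` variable are placeholders; the call to lstm_cell_backward is commented out since its implementation lives in your notebook):

```python
import numpy as np

# Hypothetical shapes: n_a hidden units, m examples, T_x time steps.
n_a, m, T_x = 5, 10, 4

da = np.ones((n_a, m, T_x))    # stands in for the gradients passed in via 'da'
da_prevt = np.zeros((n_a, m))  # no gradient arrives from beyond the last step
dc_prevt = np.zeros((n_a, m))  # same for the cell state

for t in reversed(range(T_x)):
    # Key point: the first argument combines the loss gradient at step t
    # with the gradient arriving from step t+1 (da_prevt).
    da_next = da[:, :, t] + da_prevt
    # In the notebook you would then do something like:
    # gradients = lstm_cell_backward(da_next, dc_prevt, caches[t])
    # da_prevt = gradients["da_prev"]
    # dc_prevt = gradients["dc_prev"]
```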
Do you see how you should adjust the first value to pass to lstm_cell_backward?
Good luck and please delete your code once you’ve solved this issue.