Hello, so I finished the first week of Course 5. In the coding examples provided for implementing an LSTM in numpy, some details were omitted. This wasn't required work; I'm mainly curious about how to get the derivatives of the final activations, da and dc (both of shape (n_a, m, T_x)), right before starting backpropagation.
I figured these weren't shown because computing derivatives of the final activations was already introduced in the first course of the Deep Learning Specialization. With that said, I think I've figured out da, but I'm not quite sure how to get the derivative for dc (the gradient flowing into c_next). Here's a sketch of what I have so far (see below); can anyone help me with the dc part?
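This is just a minimal numpy sketch of my current understanding, assuming a softmax output layer with cross-entropy loss at every timestep. The function name and the identifiers y_pred, y_true, and Wya are my own placeholders, not the notebook's:

```python
import numpy as np

def output_layer_gradients(y_pred, y_true, Wya):
    """
    Sketch (my assumptions, not the notebook's code): gradient of a
    softmax + cross-entropy loss w.r.t. each hidden state a<t>.
    Assumed shapes, following the assignment's convention:
        y_pred, y_true: (n_y, m, T_x)  softmax outputs / one-hot labels
        Wya:            (n_y, n_a)     hidden-to-output weights
    Returns:
        da:      (n_a, m, T_x)  gradient w.r.t. every hidden state
        dc_last: (n_a, m)       extra gradient into the final cell state
    """
    n_y, m, T_x = y_pred.shape
    n_a = Wya.shape[1]

    da = np.zeros((n_a, m, T_x))
    for t in range(T_x):
        # For softmax + cross-entropy, dZ<t> simplifies to y_pred - y_true
        dZ = y_pred[:, :, t] - y_true[:, :, t]   # (n_y, m)
        # Chain rule through y<t> = softmax(Wya @ a<t> + by)
        da[:, :, t] = Wya.T @ dZ                 # (n_a, m)

    # My guess for dc: the loss reads the last cell state only through
    # a<T_x> = Gamma_o * tanh(c<T_x>), and that path is handled inside
    # the cell's backward step, so the gradient arriving at the final
    # cell state from outside the recurrence is just zeros.
    dc_last = np.zeros((n_a, m))
    return da, dc_last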