Deep Learning Specialization, C5, W1

In Lab 1, where we learn to implement back-propagation in LSTMs from scratch, I suspect that the equation for $dp\tilde{c}^{\langle t \rangle}$ is wrong.

It should have $\left(1 - \tanh^2(\hat{c}^{\langle t \rangle})\right)$ rather than $\left(1 - \left(\tilde{c}^{\langle t \rangle}\right)^2\right)$, so the entire equation should be:

$$dp\tilde{c}^{\langle t \rangle} = \left(dc_{next} * \Gamma_u^{\langle t \rangle} + \Gamma_o^{\langle t \rangle} * \left(1 - \tanh^2(c_{next})\right) * \Gamma_u^{\langle t \rangle} * da_{next}\right) * \left(1 - \tanh^2(\hat{c}^{\langle t \rangle})\right)$$

Please let me know if I am wrong. I can share the derivation as well.
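
In outline: writing $\hat{c}^{\langle t \rangle}$ for the pre-activation $W_c\left[a^{\langle t-1 \rangle}, x^{\langle t \rangle}\right] + b_c$ (my notation, not necessarily the notebook's), the tanh step differentiates as

$$\tilde{c}^{\langle t \rangle} = \tanh\left(\hat{c}^{\langle t \rangle}\right) \quad\Longrightarrow\quad \frac{\partial \tilde{c}^{\langle t \rangle}}{\partial \hat{c}^{\langle t \rangle}} = 1 - \tanh^2\left(\hat{c}^{\langle t \rangle}\right)$$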

If you look carefully at the diagram of the LSTM cell in the notebook and at how the various graph segments are labelled, my reading is that $\tilde{c}^{\langle t \rangle}$ is already the output of the tanh:

$$\tilde{c}^{\langle t \rangle} = \tanh\left(p\tilde{c}^{\langle t \rangle}\right)$$

So if my reading is correct, then that last multiplicand in the overall expression is the correct expression for the derivative of tanh at that point in the graph.
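
A quick numerical check of that reading (a minimal numpy sketch; `pcct` and `cct` are hypothetical names for the pre-tanh and post-tanh values, not necessarily the lab's):

```python
import numpy as np

# Hypothetical pre-activation of the candidate value, W_c [a, x] + b_c
pcct = np.random.randn(5, 10)
# The candidate value itself is already the output of tanh
cct = np.tanh(pcct)

# Derivative of the tanh node, written two equivalent ways:
# evaluated at the tanh *input* ...
d_via_input = 1.0 - np.tanh(pcct) ** 2
# ... and written in terms of the tanh *output*
d_via_output = 1.0 - cct ** 2

# tanh'(z) = 1 - tanh(z)^2, so both forms agree elementwise
assert np.allclose(d_via_input, d_via_output)
print("Both forms of the tanh derivative match.")
```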

It is somewhat confusing how `dcct` refers to $dp\tilde{c}^{\langle t \rangle}$ and `dit` refers to $d\Gamma_u^{\langle t \rangle}$.

And why `dit` instead of `dut`? Yes, there are some slightly odd choices there: presumably the `i` comes from the "input gate" name, $i_t$, that much of the LSTM literature uses for what this course calls the update gate $\Gamma_u$. But notation is always somewhat arbitrary. :grinning: