W3 A1 | Ex-6 | Where were the dZ[1] & dW[1] derivative equations introduced?

Hi, the screenshot below is from https://www.coursera.org/learn/neural-networks-deep-learning/lecture/Wh8NI/gradient-descent-for-neural-networks

I am not able to find the basis of the equations in the red rectangle. I believe they should be derived somewhere in the Week 3 lectures, but I cannot find where. Please help.


I don’t think Prof Ng shows that derivation anywhere. All of this is just an application of the Chain Rule, but you also need to know some matrix calculus. The high-level point is that this course is designed not to require any knowledge of calculus, so he doesn’t show most of the derivations. The good news is that you don’t need to know calculus. The bad news is that you just have to take his word for the back-propagation formulas.
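For what it’s worth, here is a rough sketch of where those two equations come from, assuming the Week 3 setup (one hidden layer with activation $g^{[1]}$, $m$ training examples, and $*$ denoting element-wise product):

```latex
\begin{aligned}
dZ^{[1]} &= \frac{\partial \mathcal{L}}{\partial Z^{[1]}}
          = W^{[2]T} dZ^{[2]} * g^{[1]\prime}(Z^{[1]}) \\
dW^{[1]} &= \frac{\partial \mathcal{J}}{\partial W^{[1]}}
          = \frac{1}{m}\, dZ^{[1]} X^{T}
\end{aligned}
```

The first line chains backward through $Z^{[2]} = W^{[2]} A^{[1]} + b^{[2]}$ (giving the $W^{[2]T} dZ^{[2]}$ factor) and then through $A^{[1]} = g^{[1]}(Z^{[1]})$ (giving the element-wise $g^{[1]\prime}(Z^{[1]})$ factor). The second line chains through $Z^{[1]} = W^{[1]} X + b^{[1]}$, with the $\frac{1}{m}$ coming from the cost being an average over the $m$ examples.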

If you do have a calculus background, here’s a thread with links to the derivations and other background needed to understand them.
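If you would rather convince yourself numerically than work through the calculus, here is a minimal NumPy sketch (my own illustration, not course code) of the Week 3 forward and backward pass for a one-hidden-layer network with a tanh hidden layer and sigmoid output. The variable names mirror the lecture notation; the shapes and seed are made up for the example:

```python
import numpy as np

# Toy shapes: X is (n_x, m), Y is (1, m); W1 is (n_h, n_x), W2 is (1, n_h).
np.random.seed(0)
n_x, n_h, m = 3, 4, 5
X = np.random.randn(n_x, m)
Y = (np.random.rand(1, m) > 0.5).astype(float)
W1, b1 = np.random.randn(n_h, n_x), np.zeros((n_h, 1))
W2, b2 = np.random.randn(1, n_h), np.zeros((1, 1))

# Forward pass
Z1 = W1 @ X + b1
A1 = np.tanh(Z1)                  # g1 = tanh
Z2 = W2 @ A1 + b2
A2 = 1 / (1 + np.exp(-Z2))        # g2 = sigmoid

# Backward pass -- the equations from the lecture slide
dZ2 = A2 - Y                               # for sigmoid + cross-entropy cost
dW2 = (dZ2 @ A1.T) / m
db2 = dZ2.sum(axis=1, keepdims=True) / m
dZ1 = (W2.T @ dZ2) * (1 - A1 ** 2)         # chain rule; tanh'(Z1) = 1 - A1^2
dW1 = (dZ1 @ X.T) / m
db1 = dZ1.sum(axis=1, keepdims=True) / m
```

You can verify the formulas by comparing `dW1` against a finite-difference approximation of the cross-entropy cost: perturb each entry of `W1` by a small epsilon and check that the numerical gradient matches.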