# W2_A2_Optimal "nudge" dx given for each node in computational graphs

Thanks for sharing the link. We will cover Gradient Checking in Course 2 of this series; it is a way to use a relatively crude version of numeric differentiation to confirm that our back prop code is correct. Here’s a thread which links to some more of the math behind one-sided vs two-sided finite differences, which were mentioned in that section of the course.
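To make the one-sided vs two-sided point concrete, here is a small sketch (my own illustration, not code from the course) comparing the two finite-difference formulas on a simple function. The error of the forward (one-sided) difference shrinks like O(h), while the central (two-sided) difference shrinks like O(h²), which is why gradient checking uses the two-sided version:

```python
def one_sided(f, x, h=1e-4):
    # Forward difference: (f(x+h) - f(x)) / h, error is O(h)
    return (f(x + h) - f(x)) / h

def two_sided(f, x, h=1e-4):
    # Central difference: (f(x+h) - f(x-h)) / 2h, error is O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: f(x) = x^3, whose true derivative at x = 2 is 12
f = lambda x: x ** 3
err_one = abs(one_sided(f, 2.0) - 12.0)
err_two = abs(two_sided(f, 2.0) - 12.0)
print(err_one, err_two)  # the two-sided error is several orders of magnitude smaller
```

With h = 1e-4 the one-sided error here is about 6e-4 while the two-sided error is about 1e-8, so the extra function evaluation buys a much more trustworthy check.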

It might be worth a few words about Prof Ng’s pedagogical approach here. These courses are designed so that they do not require knowledge of calculus. He doesn’t show the derivations of the formulas for back propagation; he just presents them. There is plenty of material out there for people who have the math background and want to see those derivations.

Then for each type of neural network that we study, Fully Connected Nets (here in Course 1 and Course 2), Convolutional Nets (Course 4), and Recurrent Nets (Course 5), Prof Ng leads us through building the core parts of the algorithm ourselves in Python, including the basics of back propagation. When we write it by hand in Python, we do all the derivatives analytically and then just write the code for them. But as soon as we have mastered the core algorithm, he moves on to using the TensorFlow framework for building more complex solutions. That happens for the first time in Course 2 Week 3. After that, all the serious solutions are built using TF, so we no longer have to worry about the derivatives: everything is taken care of for us “under the covers” by the autodiff mechanisms in TF.
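In case it helps to demystify what “autodiff under the covers” means, here is a toy sketch of reverse-mode automatic differentiation in plain Python. This is my own illustration of the general idea, not actual TensorFlow code: each operation records its inputs and local derivatives, and a backward pass applies the chain rule through that recorded graph, which is roughly the bookkeeping a framework does for you:

```python
class Var:
    """Toy scalar value that records the graph of operations applied to it."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        # Chain rule: push the incoming gradient to each parent, scaled by
        # the local derivative. This naive recursion sums over all paths,
        # which is fine for a toy example like this one.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

# y = x*x + x, so dy/dx = 2x + 1, which is 7 at x = 3
x = Var(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```

A real framework does the same thing with tensors instead of scalars, a much larger library of ops, and a topologically sorted backward pass instead of this naive recursion, but the chain-rule bookkeeping is the same idea, and it is why we never have to write derivative code by hand once we switch to TF.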

That’s how things work in real applications as well: the State of the Art these days is so complex that nobody can write all their own code from scratch in Python. If you are a researcher at the leading edge who is literally creating new algorithms, then you’ll be working in Python to prototype things and prove whether they work or not. As soon as you have something that’s proven to work, you publish the paper, and then you or someone else writes the code for TF, PyTorch, and the other frameworks to implement your new concept, so that it becomes part of the new SOTA.