Calculating gradients with backpropagation in frameworks like TensorFlow

Prof. Ng has mentioned multiple times that we need not worry about backpropagation because the framework takes care of it. So is it true that the framework computes the gradients and performs backpropagation on its own for any new, complex network, as long as we use TensorFlow or PyTorch to implement the feed-forward steps?

Yes. This is because the backpropagation logic is built into each layer type you might include in your model, so the gradients are computed for you automatically.
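To give some intuition for how this works, here is a toy sketch (my own illustration, not TensorFlow's or PyTorch's actual internals) of reverse-mode automatic differentiation: each operation records its inputs and a local-gradient rule, and `backward()` walks the graph in reverse, applying the chain rule. You only write the forward pass; the gradients come for free.

```python
class Value:
    """A toy autodiff scalar: tracks data, gradient, and the graph."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # local chain-rule step

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            # d(out)/d(self) = 1, d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            # d(out)/d(self) = other, d(out)/d(other) = self
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward_fn
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# We only define the forward pass: y = w * x + b
x, w, b = Value(2.0), Value(3.0), Value(1.0)
y = w * x + b
y.backward()
print(w.grad, x.grad, b.grad)  # 2.0 3.0 1.0
```

Real frameworks do the same thing at scale: every built-in layer or op ships with its gradient rule, so any network you compose out of them is differentiable end to end. In TensorFlow this recording happens under `tf.GradientTape`; in PyTorch it happens automatically for tensors with `requires_grad=True`.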

I can’t speak for PyTorch, since it’s not my preferred tool, but the same principle applies there.


Thank you. 🙂