Course 2, week 1, Grad Check

Hi guys,
In Course 2, week 1, Grad Check, Exercise 2 - backward propagation:
Doesn't this method just return x? If so, wouldn't there be no difference between the returned values J_plus, J_minus, and J? In other words, what is the effect of theta?

Hi, @Baraa.

backward_propagation is not used to compute J_plus or J_minus.

What you are doing is using the definition of the derivative to approximate the gradient numerically (with J_plus and J_minus) and comparing it with the gradient returned by backward_propagation. If backward_propagation is implemented correctly, the two values should be very close.
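
For concreteness, here is a minimal sketch of that comparison in Python. The bodies of forward_propagation and backward_propagation below are my assumptions, chosen to match the 1D cost J(theta) = theta * x this exercise appears to use (which is why backward_propagation just returns x); the epsilon value and the relative-difference formula follow the standard gradient-checking recipe:

```python
import numpy as np

def forward_propagation(x, theta):
    # Assumed 1D cost for this exercise: J(theta) = theta * x
    return theta * x

def backward_propagation(x, theta):
    # Analytic gradient dJ/dtheta = x; note that theta never appears here
    return x

def gradient_check(x, theta, epsilon=1e-7):
    # Two-sided difference approximation of dJ/dtheta
    J_plus = forward_propagation(x, theta + epsilon)
    J_minus = forward_propagation(x, theta - epsilon)
    gradapprox = (J_plus - J_minus) / (2 * epsilon)

    # Gradient computed analytically by backward propagation
    grad = backward_propagation(x, theta)

    # Relative difference: a small value means backprop is very likely correct
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

print(gradient_check(x=2.0, theta=4.0))  # expect a value near zero
```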

The parameter theta has no effect in backward_propagation because it does not appear in the derivative of J with respect to theta (though it would for a different cost function).
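
To make this concrete, here is the one-line derivation, again assuming the exercise's cost is J(theta) = theta * x:

$$
J(\theta) = \theta x \quad\Rightarrow\quad \frac{dJ}{d\theta} = x
$$

The derivative is independent of theta, so backward_propagation never needs it. For a cost like $J(\theta) = \theta^2 x$, the derivative $\frac{dJ}{d\theta} = 2\theta x$ would depend on theta.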

Did you complete the assignment? 🙂

Thanks, @nramon. Apparently, I was confused about how gradient checking works. I am about to complete the assignment. Your help is really appreciated.

Happy to help, @Baraa. Good luck with the assignment!

Thank you, @nramon!