DLS2 Week1 Assignment 3 gradient_check_n


I got stuck on the gradient_check_n function. The result is slightly different from the expected value:

There is a mistake in the backward propagation! difference = 0.24389144784740008
AssertionError Traceback (most recent call last)

      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values

However, after I corrected the errors in backward_propagation_n and reran the gradient_check_n function, the output is correct:

Your backward propagation works perfectly fine! difference = 1.1890912740685776e-07

There is an earlier post describing precisely the same issue as mine, down to the same output number (DLS2 Week 1, Assignment 3, gradient_check_n subtle difference), but I didn’t find a solution there.

Best regards

Interesting. An error in the 3rd decimal place is not a rounding error. We’re doing 64-bit arithmetic here, so rounding errors are typically on the order of $10^{-16}$, although they can compound in certain pathological cases.
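For reference, NumPy exposes the double-precision machine epsilon directly, which makes the scale of "normal" rounding error easy to check. A minimal sketch:

```python
import numpy as np

# 64-bit floats carry roughly 16 significant decimal digits, so the
# rounding error of a single operation is on the order of 1e-16.
eps = np.finfo(np.float64).eps
print(eps)  # machine epsilon, about 2.22e-16

# An error in the 3rd decimal place (0.244 vs. the expected 1.19e-07)
# is many orders of magnitude too large to be rounding noise; it
# points to a formula-level bug, not floating-point arithmetic.
```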

It’s also interesting that you found a thread more than a year old with apparently the same issue. I assume you have already been through your code carefully and compared it to the instructions, so I think we need to look at the source code to advise on this. Please check your DMs for a message from me about how to proceed.

Just to close the loop here, I had a private conversation with James and it turns out there is a subtle bug in the implementation of the gradient checking formulas. Look carefully at the denominator in this expression:

$diff = \displaystyle \frac{\|grad - gradapprox\|}{\|grad\| + \|gradapprox\|}$

It turns out that you get something very close to the correct answer if you do this instead:

$diff = \displaystyle \frac{\|grad - gradapprox\|}{2\,\|grad\|}$
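To see why the second denominator lands so close to the right answer, note that when backprop is nearly correct, grad ≈ gradapprox, so ||grad|| + ||gradapprox|| ≈ 2·||grad||. A small sketch (assuming grad and gradapprox are already flattened into 1-D vectors, as in the assignment):

```python
import numpy as np

def gradient_difference(grad, gradapprox):
    """Relative difference used in gradient checking (correct formula)."""
    numerator = np.linalg.norm(grad - gradapprox)                    # ||grad - gradapprox||
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)  # ||grad|| + ||gradapprox||
    return numerator / denominator

# Toy example where backprop is "almost right":
grad = np.array([1.0, 2.0, 3.0])
gradapprox = grad + 1e-8

correct = gradient_difference(grad, gradapprox)
# Buggy denominator: 2 * ||grad|| instead of the sum of the two norms.
buggy = np.linalg.norm(grad - gradapprox) / (2 * np.linalg.norm(grad))

# Because grad ≈ gradapprox here, the two denominators nearly coincide,
# so the buggy version still produces a value very close to the correct one.
print(correct, buggy)
```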


If anyone is getting difference values other than the expected ones, e.g. 0.51 or 0.99:
Check the calculations of theta_plus and theta_minus. You might have copy-pasted the same calculation for both, which gives weird results like those above. I made the same mistake, and it took me hours to realize it.
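To illustrate the trap above, here is a minimal sketch of a two-sided numerical gradient (the helper name and cost function are hypothetical, not from the assignment). The key point is that theta_plus and theta_minus must perturb component i in opposite directions; copy-pasting the same +epsilon line for both breaks the approximation:

```python
import numpy as np

def numerical_gradient(J, theta, epsilon=1e-7):
    """Two-sided numerical gradient of a scalar cost J at vector theta.

    Hypothetical helper: J is any callable taking a 1-D vector and
    returning a scalar.
    """
    gradapprox = np.zeros_like(theta)
    for i in range(theta.size):
        theta_plus = theta.copy()
        theta_minus = theta.copy()
        theta_plus[i] += epsilon    # perturb component i upward
        theta_minus[i] -= epsilon   # perturb downward (NOT another +epsilon)
        gradapprox[i] = (J(theta_plus) - J(theta_minus)) / (2 * epsilon)
    return gradapprox

# Example cost: J(theta) = theta . theta, whose true gradient is 2 * theta.
theta = np.array([1.0, -2.0, 0.5])
approx = numerical_gradient(lambda t: t @ t, theta)
print(approx)  # close to [2, -4, 1]
```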