# Week 1 Exercise 3 gradient_check_n

I’ve tried everything and can’t determine what I am doing wrong. I wrote all the code for the gradient check function and then fixed the problems in the backprop function, but my difference is still slightly off the expected value, and I am not getting a correct backprop implementation result either. Please help!

```
There is a mistake in the backward propagation! difference = 0.33333334789859204
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-12-c57ee5e9e05a> in <module>
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values
```


Check that first assertion error: it tells you that you are not using what the test needs (`np.linalg.norm` for the numerator or denominator)! This is probably your issue!

I didn’t get that assertion error; I am using np.linalg.norm. I only got the second assertion error, “Wrong value. It is not one of the expected values”.
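For context, the difference the test checks is normally a scalar built from `np.linalg.norm` applied to whole vectors, along these lines (a minimal sketch; the names `grad` and `gradapprox` follow the notebook's convention, and the helper function here is hypothetical):

```python
import numpy as np

def compute_difference(grad, gradapprox):
    # Relative difference between the analytic gradient and its
    # numerical approximation, both treated as full vectors.
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

grad = np.array([1.0, 2.0, 3.0])
gradapprox = np.array([1.0, 2.0, 3.0001])
print(compute_difference(grad, gradapprox))  # tiny value when gradients agree
```

If `difference` comes out far from the expected values even though it is a scalar, the vectors going into the norms are usually the wrong shape or only partially filled.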

1 Like

Are you failing Exercise 2 - backward_propagation? If so, first you need to correct it and then move on to the next one.

1 Like

One common mistake is not computing `gradapprox` as a full vector, so that only one component of it ends up in the final difference calculation.
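To make that concrete, here is a minimal sketch of the loop (assuming a cost function `J(theta)` that takes the full parameter vector; this is not the notebook's exact signature): every component `i` gets its own two-sided estimate, and the whole vector feeds into the difference.

```python
import numpy as np

def gradient_check_sketch(J, theta, epsilon=1e-7):
    num_parameters = theta.shape[0]
    gradapprox = np.zeros((num_parameters,))
    for i in range(num_parameters):
        # Perturb only component i, once in each direction.
        theta_plus = np.copy(theta)
        theta_plus[i] = theta_plus[i] + epsilon
        theta_minus = np.copy(theta)
        theta_minus[i] = theta_minus[i] - epsilon
        # Two-sided difference quotient for component i.
        gradapprox[i] = (J(theta_plus) - J(theta_minus)) / (2. * epsilon)
    return gradapprox  # full vector, not a single component

# Example: J(theta) = sum(theta**2) has gradient 2*theta.
theta = np.array([1.0, -2.0, 3.0])
approx = gradient_check_sketch(lambda t: np.sum(t**2), theta)
```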

Another landmine is the order of operations in the \frac {1}{2\epsilon} computation. Try this and watch what happens:

```python
m = 5.
x = 1. / 2. * m
y = 1. / (2. * m)
```

If you’re expecting x and y to have the same value, you’re in for a nasty surprise.
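The reason: Python evaluates `*` and `/` left to right with equal precedence, so without parentheses the trailing factor multiplies instead of divides. In the gradient check this means the `2 * epsilon` denominator must be parenthesized. A small illustration with made-up cost values (`J_plus` and `J_minus` are hypothetical):

```python
epsilon = 1e-7
J_plus, J_minus = 0.5000002, 0.4999998  # hypothetical cost values

wrong = (J_plus - J_minus) / 2. * epsilon    # divides by 2, then multiplies by epsilon
right = (J_plus - J_minus) / (2. * epsilon)  # the intended difference quotient
print(wrong, right)
```

The two results differ by many orders of magnitude, which is more than enough to blow the difference past the expected values.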

2 Likes

I used the correct implementation for that computation, and my implementation for Exercise 2 passed all the checks, but I still can’t find the issue. I’ve checked every line dozens of times, and fixed the errors in the backward_propagation_n function. What should I try? Thank you for the help!

Please check your DMs for a message from me about how to proceed. You can recognize a DM (Direct Message) by the little envelope icon.

To close the loop on the public thread: it was a classic “copy/paste” error. Some code was duplicated from the theta_plus case to the theta_minus case, and not everything got converted.
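For anyone hitting the same wall, the bug pattern looks something like this (an illustrative sketch, not the actual code from the exercise):

```python
import numpy as np

theta = np.array([1.0, 2.0])
epsilon = 1e-7
i = 0

theta_plus = np.copy(theta)
theta_plus[i] = theta_plus[i] + epsilon

# Buggy: this block was pasted from the theta_plus case above and only
# partially edited, so it still adds epsilon instead of subtracting it.
theta_minus = np.copy(theta)
theta_minus[i] = theta_minus[i] + epsilon  # should be: - epsilon
```

With that leftover `+ epsilon`, `theta_plus` and `theta_minus` are identical, so the difference quotient for that component silently comes out wrong. Reading any pasted block line by line, symbol by symbol, is usually the fastest way to spot this.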

2 Likes