# W1 gradient check: all tests passed but got 60/100

Hi, here are my results:

Ex1
J = 8
All tests passed.

Ex2
dtheta = 3
All tests passed.

Ex3
3 2.9999999995311555
Your backward propagation works perfectly fine! difference = 1.221195460051932e-20

Ex4
Your backward propagation works perfectly fine! difference = 2.827876250996049e-14
But I got this error:
```
AssertionError                            Traceback (most recent call last)
in
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values
```


It looks like you are making a mistake in computing the difference in both cases. Here's what I get as the answer for Ex3:

`Your backward propagation works perfectly fine! difference = 2.919335883291695e-10`

Notice that in both cases, your answer is about the square of the correct answer at least in terms of order of magnitude. BTW in Ex4, I assume you had already fixed the intentional bugs that they put into the back prop function.

I suggest you carefully compare your implementation of the difference calculation to the math formulas given in the assignment and the lecture. Are you sure you didn't use the square of the norm in the numerator?
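For reference, here's a minimal sketch of that difference formula in NumPy. The names `grad` and `gradapprox` follow the notebook's convention, but the toy vectors below are made up purely for illustration:

```python
import numpy as np

# Toy stand-ins for the real gradients (illustrative values only)
grad = np.array([0.5, -1.2, 3.0])
gradapprox = grad + 1e-7  # a numerical approximation, slightly off

# difference = ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
# np.linalg.norm computes the 2-norm by default -- it is NOT squared.
numerator = np.linalg.norm(grad - gradapprox)
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
difference = numerator / denominator

print(difference)  # a small scalar, on the order of 1e-8 for these toy values
```

Note that `difference` comes out as a plain scalar, which is exactly what the `type(difference) == np.ndarray` assertion in the grader is checking for.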


Note that in the formulas when they say `||v||_2`, that's a "subscript 2" there, meaning that is the 2-norm. It's not saying to square the norm. That would look like this: `||v||_2^2`.


Thanks for the answer, I got it working.
