I have completed all the steps, but I am still getting this error:

There is a mistake in the backward propagation! difference = 1.0
Error: Wrong output
0 Tests passed
1 Tests failed

AssertionError Traceback (most recent call last)
in
6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
7
----> 8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)

~/work/release/W1A3/test_utils.py in single_test(test_cases, target)
122 print('\033[92m', success, " Tests passed")
123 print('\033[91m', len(test_cases) - success, " Tests failed")
--> 124 raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
125
126 def multiple_test(test_cases, target):

AssertionError: Not all tests were passed for gradient_check_n. Check your equations and avoid using global variables inside the function.

6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"

There seems to be something wrong with the way you computed either the numerator or the denominator. Did you use np.linalg.norm? Double-check that first and let me know if you can't find the problem.

I had the same problem and thought it was something to do with the usage of np.linalg.norm. But it turned out I was using the wrong operation in the denominator (- instead of +). You should also check the dimensions of your matrices carefully.
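For reference, a minimal sketch of the difference formula with the correct denominator (the array values here are made up purely for illustration):

```python
import numpy as np

# Illustrative stand-ins for the analytic and approximated gradients.
grad = np.array([0.1, -0.2, 0.3])
gradapprox = np.array([0.1000001, -0.2000001, 0.3000001])

numerator = np.linalg.norm(grad - gradapprox)                    # ||grad - gradapprox||
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)  # note: +, not -
difference = numerator / denominator

print(type(difference))  # a numpy float scalar, not an ndarray
```

Because np.linalg.norm reduces each array to a scalar, the resulting difference is a plain numpy float, which is exactly what the type assertion in the test checks for.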

AssertionError Traceback (most recent call last)
in
6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
7
----> 8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)

~/work/release/W1A3/test_utils.py in single_test(test_cases, target)
122 print('\033[92m', success, " Tests passed")
123 print('\033[91m', len(test_cases) - success, " Tests failed")
--> 124 raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
125
126 def multiple_test(test_cases, target):

AssertionError: Not all tests were passed for gradient_check_n. Check your equations and avoid using global variables inside the function.

I don’t have any global variables inside the function.

I’m getting the same error and none of the other solutions have worked for me. I’ve double checked that I’m using np.linalg.norm as well as the precedence for the difference. Any suggestions as to what else I can check for?

I'm getting the same error, and I have checked that I'm using np.linalg.norm as well as the precedence in the difference formula. I don't know what else to check. My theta_plus[i] is different from my theta_minus[i]. What can I do to check?

Hi,
I am getting a similar error message, although the result looks good, as does the shape of the "difference" variable. Where else should I check? Thank you.

<class 'numpy.ndarray'>
Your backward propagation works perfectly fine! difference = [[1.1890913e-07]]

AssertionError Traceback (most recent call last)
in
5 difference = gradient_check_n(parameters, gradients, X, Y, 1e-7, True)
6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
----> 7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: You are not using np.linalg.norm for numerator or denominator

That is not the correct type for the difference value. It should be a numpy scalar of type np.float64. Your value is (obviously) a 1 x 1 numpy array. So how did that happen? Maybe a "keepdims" parameter somewhere it wasn't needed?

Here’s a little sample code to show the type of the norm:
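A minimal sketch of such a check (the array values are made up purely for illustration):

```python
import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])
n = np.linalg.norm(a)  # Frobenius norm of the whole array

print(type(n))   # a numpy float64 scalar, not an ndarray
print(n.shape)   # () -- zero-dimensional
```

The norm collapses the whole array to a single scalar; if your difference still has a shape like (1, 1), the norm was not applied where it should have been.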

Hi,
I'm getting a similar error that I can't solve. I checked all the bugs mentioned in this topic and couldn't find any in my code. Can someone please help?

Error:
There is a mistake in the backward propagation! difference = 0.9999999999999866

AssertionError Traceback (most recent call last)
in
6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), “Wrong value. It is not one of the expected values”

AssertionError: Wrong value. It is not one of the expected values

Yes, I used the same formula both in gradient_check() and gradient_check_n(). gradient_check() test passes successfully, and in gradient_check_n() it fails…

Ok, then the mistake is elsewhere in gradient_check_n. There are plenty of other mistakes you could have made. The next thing to check is how you implement the "bump by epsilon" logic: are you sure you didn't apply it to all elements of theta at once, rather than just one? That would be a critical difference between the 1D and the multidimensional case.
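A sketch of that per-element bump, using a toy cost function (the function and variable names here are illustrative, not the notebook's exact code):

```python
import numpy as np

def gradapprox_sketch(J, parameters_values, epsilon=1e-7):
    """Two-sided finite differences, bumping ONE element of theta per iteration."""
    num_parameters = parameters_values.shape[0]
    gradapprox = np.zeros((num_parameters, 1))
    for i in range(num_parameters):
        theta_plus = np.copy(parameters_values)   # fresh copy each iteration
        theta_plus[i] += epsilon                  # bump only element i, upward
        theta_minus = np.copy(parameters_values)
        theta_minus[i] -= epsilon                 # bump only element i, downward
        gradapprox[i] = (J(theta_plus) - J(theta_minus)) / (2 * epsilon)
    return gradapprox

# Toy cost J(theta) = sum(theta^2), whose true gradient is 2 * theta.
theta = np.array([[1.0], [2.0], [3.0]])
approx = gradapprox_sketch(lambda t: np.sum(t ** 2), theta)
```

The key point is that each iteration copies the full parameter vector and perturbs exactly one entry; adding epsilon to the whole vector at once produces a completely wrong approximation in the multidimensional case.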