For the life of me, I can’t figure out what I’m doing wrong here. The output seems to be telling me that I’m not using np.linalg.norm for the numerator or denominator … but I am! I need to understand what I’m doing wrong because I’m losing momentum in this course … please help!
There is a mistake in the backward propagation! difference = 2.4577112074727173e-07
Error: Wrong output
0 Tests passed
1 Tests failed
---------------------------------------------------------------------------
AssertionError Traceback (most recent call last)
<ipython-input-12-0b57cb811a44> in <module>
6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
7
----> 8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)
~/work/release/W1A3/public_tests.py in gradient_check_n_test(target, parameters, gradients, X, Y)
56 ]
57
---> 58 single_test(test_cases, target)
59
60 def predict_test(target):
~/work/release/W1A3/test_utils.py in single_test(test_cases, target)
122 print('\033[92m', success," Tests passed")
123 print('\033[91m', len(test_cases) - success, " Tests failed")
--> 124 raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
125
126 def multiple_test(test_cases, target):
AssertionError: Not all tests were passed for gradient_check_n. Check your equations and avoid using global variables inside the function.
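For context, here is roughly how I understand the difference is supposed to be computed so that it ends up as a single scalar rather than an array. This is only a minimal sketch; grad and gradapprox are assumed names for the flattened gradient vectors, not my exact notebook code:

```python
import numpy as np

def relative_difference(grad, gradapprox):
    # ||grad - gradapprox||_2 : one np.linalg.norm call on the difference vector (a scalar)
    numerator = np.linalg.norm(grad - gradapprox)
    # ||grad||_2 + ||gradapprox||_2 : two separate norms, added together (also a scalar)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    # scalar / scalar, so the result is a float, not an np.ndarray
    return numerator / denominator
```

As far as I can tell from the traceback, the type assert on line 6 actually passes (the failure is raised from gradient_check_n_test, and the message is "Error: Wrong output"), so it looks like the value of the difference is what the grader is rejecting, not the use of np.linalg.norm itself.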