Week 1 Assignment 3: gradient_check_n function error

Hi,
I am facing an error in the Week 1, Assignment 3 gradient_check_n function.
Kindly help.

There is a mistake in the backward propagation! difference = 1.0
Error: Wrong output
0 Tests passed
1 Tests failed

AssertionError Traceback (most recent call last)
in
6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
7
----> 8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)

~/work/release/W1A3/public_tests.py in gradient_check_n_test(target, parameters, gradients, X, Y)
56 ]
57
---> 58 single_test(test_cases, target)
59
60 def predict_test(target):

~/work/release/W1A3/test_utils.py in single_test(test_cases, target)
122 print('\033[92m', success, " Tests passed")
123 print('\033[91m', len(test_cases) - success, " Tests failed")
--> 124 raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
125
126 def multiple_test(test_cases, target):

AssertionError: Not all tests were passed for gradient_check_n. Check your equations and avoid using global variables inside the function.

Hi @sathishceog,

Check out the description of the exercise and the error message. You should be using np.linalg.norm as part of the function you are writing.
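
As a minimal sketch (not the assignment solution), the last step of gradient_check_n usually computes a relative difference between the analytic gradient vector and the numerical approximation using np.linalg.norm for both the numerator and the denominator. The helper name compute_difference below is purely illustrative:

```python
import numpy as np

def compute_difference(grad, gradapprox):
    # Relative difference between the analytic and approximated gradients:
    # ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
    numerator = np.linalg.norm(grad - gradapprox)                      # L2 norm of the difference
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)    # sum of the individual L2 norms
    return numerator / denominator
```

Because np.linalg.norm returns a scalar, the result is a plain float rather than an np.ndarray, which is exactly what the assert in the test is checking for.
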

Cheers,