DLS Course 2 Week 1- Assignment 3- Exercise 4- gradient_check_n

I am using np.linalg.norm for both the numerator and the denominator, but I am still getting the error below:
There is a mistake in the backward propagation! difference = 1.0
Error: Wrong output
0 Tests passed
1 Tests failed
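(For context, the test asserts that `difference` is a single scalar, which it will be when `np.linalg.norm` is applied to both the numerator and the denominator of the standard relative-difference formula. Below is a minimal, generic sketch of that formula only, not the assignment's solution code; `grad` and `gradapprox` are placeholder names for the analytic and numerical gradients.)

```python
import numpy as np

# Generic sketch of the relative-difference check used in gradient checking.
# Illustration only, not the assignment code.
def relative_difference(grad, gradapprox):
    numerator = np.linalg.norm(grad - gradapprox)                     # ||grad - gradapprox||
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)   # ||grad|| + ||gradapprox||
    return numerator / denominator                                    # a plain scalar, not an ndarray
```

When the analytic gradient is correct, this scalar should come out well below the 10^-7 threshold.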

AssertionError Traceback (most recent call last)
in
6 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
7
----> 8 gradient_check_n_test(gradient_check_n, parameters, gradients, X, Y)

~/work/release/W1A3/public_tests.py in gradient_check_n_test(target, parameters, gradients, X, Y)
56 ]
57
---> 58 single_test(test_cases, target)
59
60 def predict_test(target):

~/work/release/W1A3/test_utils.py in single_test(test_cases, target)
122 print('\033[92m', success," Tests passed")
123 print('\033[91m', len(test_cases) - success, " Tests failed")
--> 124 raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
125
126 def multiple_test(test_cases, target):

AssertionError: Not all tests were passed for gradient_check_n. Check your equations and avoid using global variables inside the function.

Please help!


Fixed by the original poster before I contacted him.

It would be great if you could share the solution with other learners, @Ruchi.

Good luck with the rest of the course :slight_smile:

For calculating

[Removed solution code]

What I was doing wrong was not subtracting epsilon from theta_minus[i].
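To make that pitfall concrete, here is a generic sketch of two-sided numerical gradient checking, assuming `J` is any scalar-valued cost function and `theta` is a 1-D parameter vector. It is an illustration of the general technique, not the assignment's solution code.

```python
import numpy as np

# Generic two-sided numerical gradient, for illustration only.
def numerical_gradient(J, theta, epsilon=1e-7):
    gradapprox = np.zeros_like(theta)
    for i in range(theta.shape[0]):
        theta_plus = np.copy(theta)
        theta_minus = np.copy(theta)
        theta_plus[i] = theta_plus[i] + epsilon    # perturb upward
        theta_minus[i] = theta_minus[i] - epsilon  # perturb downward (the subtraction that was missing)
        gradapprox[i] = (J(theta_plus) - J(theta_minus)) / (2 * epsilon)
    return gradapprox
```

For example, `numerical_gradient(lambda t: np.sum(t**2), np.array([1.0, 2.0]))` returns approximately [2.0, 4.0]; without the subtraction on theta_minus, the approximation is badly off and the difference check fails.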


Exactly my issue as well. An error also resulted from copy/pasting the code from my own "plus" lines above. I ended up spending more time troubleshooting than it would have taken to type it out in the first place. Being lazy sometimes does not pay off... almost never.


DLS Course 2 Week 1- Assignment 3- Exercise 3- gradient check
I am stuck on the previous exercise, 3 - gradient check.
“There is a mistake in the backward propagation! difference = 0.3333333333333333”
Then it says "All tests passed."
The next line says "Congrats, the difference is smaller than the 10^-7 threshold. So you can have high confidence that you've correctly computed the gradient in backward_propagation()".
Did I get it correct or did I get it wrong?

Fixed by @yodester too.

Pay attention to operator precedence!
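For other learners hitting a spurious nonzero difference while the tests still pass: a typical precedence slip is letting division apply to only the first norm, as in the hypothetical snippet below (placeholder values, not the assignment code). Division binds tighter than addition, so the denominator must be parenthesized.

```python
import numpy as np

# Hypothetical gradients, only to illustrate the operator-precedence pitfall.
grad = np.array([1.0, 2.0, 3.0])
gradapprox = np.array([1.0, 2.0, 3.0])

numerator = np.linalg.norm(grad - gradapprox)

# Wrong: '/' binds tighter than '+', so this evaluates to
# (numerator / ||grad||) + ||gradapprox|| instead of the intended ratio.
wrong = numerator / np.linalg.norm(grad) + np.linalg.norm(gradapprox)

# Right: parenthesize the whole denominator.
difference = numerator / (np.linalg.norm(grad) + np.linalg.norm(gradapprox))
```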

Good luck with the rest of the course 🙂