Gradient_check_n

I corrected dW2 and db1, and here is my code:

{moderator edit - solution code removed}

but I get this output:

There is a mistake in the backward propagation! difference = 0.33333334789859204

AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values

Can anybody help with this problem?

As I think I mentioned in one of our earlier conversations, we're not supposed to post our source code publicly. It does make helping a lot easier, but in cases where we need to see the code, there are private ways to do that.

Have a look at your calculation of theta_minus[i]: there is a “copy/paste” error in that line of code that probably accounts for the wrong result.
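For anyone who hits the same assertion later: below is a minimal, generic sketch of two-sided gradient checking, written from the standard centered-difference formula rather than from the assignment's solution code. The `cost` callable and the 1-D parameter vector `theta` are hypothetical stand-ins, not the course's API. The `theta_minus` line is exactly where a copy/paste slip tends to happen: the perturbation must be subtracted there, not added.

```python
import numpy as np

def gradient_check_n(cost, theta, grad, epsilon=1e-7):
    """Compare an analytic gradient `grad` against a numerical one.

    Generic sketch only: `cost(theta)` is assumed to return the scalar
    cost at the 1-D parameter vector `theta`.
    """
    gradapprox = np.zeros_like(grad)

    for i in range(theta.shape[0]):
        # Perturb one parameter at a time, in both directions.
        theta_plus = np.copy(theta)
        theta_plus[i] += epsilon
        theta_minus = np.copy(theta)
        theta_minus[i] -= epsilon   # common copy/paste bug: += left here

        # Centered-difference approximation of the i-th partial derivative.
        gradapprox[i] = (cost(theta_plus) - cost(theta_minus)) / (2 * epsilon)

    # Relative difference. np.linalg.norm returns a scalar, which is
    # what the notebook's type check on `difference` is enforcing.
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

# Quick self-test with a quadratic cost whose gradient is known exactly:
# cost(theta) = 0.5 * ||theta||^2, so grad = theta.
theta = np.array([1.0, -2.0, 3.0])
diff = gradient_check_n(lambda t: 0.5 * np.sum(t ** 2), theta, theta)
print(diff)  # tiny (around 1e-10), far below the ~1e-7 ballpark
```

If the subtraction in the `theta_minus` line is accidentally left as `+=`, both perturbations point the same way, the centered difference collapses, and you get a large relative difference of the kind shown in the output above.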


My apologies, and thanks for pointing it out.