I’ve tried everything and can’t figure out what I’m doing wrong. I wrote all the code for the gradient check function and then fixed the bugs in the backprop function, but my difference value is still slightly off the expected value, and the backward propagation check fails as well. Please help!

There is a mistake in the backward propagation! difference = 0.33333334789859204
---------------------------------------------------------------------------
AssertionError Traceback (most recent call last)
<ipython-input-12-c57ee5e9e05a> in <module>
6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"
AssertionError: Wrong value. It is not one of the expected values

I used the correct implementation for that computation, and my implementation for Exercise 2 passed all the checks, but I still can’t find the issue. I’ve checked every line dozens of times and fixed the errors in the backward_propagation_n function. What else should I try? Thanks for the help!
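For anyone hitting the same assertion: the test expects `difference` to be a scalar computed with `np.linalg.norm`. Here is a minimal sketch of that computation (the function name and the toy vectors are mine, not the assignment's exact code):

```python
import numpy as np

def gradient_check_difference(grad, gradapprox):
    """Relative difference used in gradient checking:
    ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2).
    np.linalg.norm on both parts guarantees a scalar result,
    which is what the assert on type(difference) is checking."""
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

# Toy usage: nearly identical gradients give a tiny difference.
grad = np.array([1.0, 2.0, 3.0])
gradapprox = np.array([1.0, 2.0, 3.0000001])
print(gradient_check_difference(grad, gradapprox))
```

If `difference` comes back as an ndarray instead of a scalar, the usual cause is element-wise operations (e.g. `np.abs` or `/` on arrays) in place of the norms.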

To close the loop on the public thread: it was a classic copy/paste error. The theta_plus block had been duplicated to create the theta_minus case, and not every occurrence was converted from plus to minus.