CW1 3rd assignment: Gradient checking - unexpected value error

There is a mistake in the backward propagation! difference = 0.6623181774407311

AssertionError Traceback (most recent call last)
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values

I'm receiving the above error. I've gone through my code multiple times but can't find the actual problem. If anyone has found a solution or workaround, please let me know. Thanks!

I'm using np.linalg.norm for both the numerator and the denominator.
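For what it's worth, the shape of that computation usually looks like the sketch below (variable values here are made up for illustration; only the use of np.linalg.norm in both the numerator and the denominator matters). If either norm is missing, `difference` comes out as an ndarray rather than a scalar, which is exactly what the first assert checks for.

```python
import numpy as np

# Hypothetical stand-ins for the backprop gradient and its numerical approximation.
grad = np.array([0.3, -1.2, 0.8])
gradapprox = np.array([0.3001, -1.1999, 0.8002])

# Relative difference between the two gradients; np.linalg.norm reduces
# each term to a scalar, so `difference` is a single float.
numerator = np.linalg.norm(grad - gradapprox)
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
difference = numerator / denominator
print(difference)
```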

Please check your DMs for further instructions from me.

Thank you for sending me your code. Your implementation of gradapprox[i] is incorrect. Check the formula again:

gradapprox[i] = \frac{J^{+}_i - J^{-}_i}{2 \varepsilon}

It’s J, not theta, right?
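To spell that out: the two-sided approximation takes the difference of the *cost* J evaluated at theta ± epsilon, not the difference of theta itself. A minimal sketch of the idea (function and variable names are illustrative, not the assignment's exact API):

```python
import numpy as np

def J(theta):
    # Hypothetical scalar cost function, just for illustration.
    return np.sum(theta ** 2)

def approximate_gradient(theta, epsilon=1e-7):
    gradapprox = np.zeros_like(theta)
    for i in range(theta.shape[0]):
        theta_plus = theta.copy()
        theta_plus[i] += epsilon          # theta^{+}
        theta_minus = theta.copy()
        theta_minus[i] -= epsilon         # theta^{-}
        # Two-sided difference of the COST J (not of theta):
        gradapprox[i] = (J(theta_plus) - J(theta_minus)) / (2 * epsilon)
    return gradapprox

theta = np.array([1.0, -2.0, 3.0])
print(approximate_gradient(theta))  # should be close to the true gradient 2*theta
```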

Thank you so much for the response and your time. This resolved my issue!