Update 2:
Oh great! The "mistake in backward propagation" message is actually the correct output!
(This part is solved; I have now moved on to "There is a mistake in the backward propagation! difference = 0.2850931567761623".)
Hi,
I followed the instructions in the pseudocode, changing the forward_propagation_n(x, y, …) part to X and Y (because lowercase gives "name y is not defined").
Now it says "Wrong value".
Here is the error message:
There is a mistake in the backward propagation! difference = 1.0
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      6 expected_values = [0.2850931567761623, 1.1890913024229996e-07]
      7 assert not(type(difference) == np.ndarray), "You are not using np.linalg.norm for numerator or denominator"
----> 8 assert np.any(np.isclose(difference, expected_values)), "Wrong value. It is not one of the expected values"

AssertionError: Wrong value. It is not one of the expected values
Are you calculating gradapprox for each parameter using the J_plus and J_minus for that parameter?
Do keep in mind that when you skip an index, broadcasting will kick in and apply the operation to all elements.
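To make that concrete, here is a minimal, self-contained sketch of the two-sided finite-difference loop. The toy cost_fn and theta values here are made up for illustration; in the assignment the cost comes from forward_propagation_n(X, Y, …) and the parameter vector from the dictionary/vector helpers.

```python
import numpy as np

def cost_fn(theta):
    # Toy stand-in for the assignment's cost from forward_propagation_n:
    # any scalar function of the parameter vector works for illustration.
    return float(np.sum(theta ** 2))

theta = np.array([[1.0], [2.0], [3.0]])  # column vector of parameters
epsilon = 1e-7
num_parameters = theta.shape[0]

J_plus = np.zeros((num_parameters, 1))
J_minus = np.zeros((num_parameters, 1))
gradapprox = np.zeros((num_parameters, 1))

for i in range(num_parameters):
    # Perturb ONLY parameter i; the explicit [i] index on both sides is
    # what keeps broadcasting from shifting every element at once.
    theta_plus = np.copy(theta)
    theta_plus[i] = theta_plus[i] + epsilon
    J_plus[i] = cost_fn(theta_plus)

    theta_minus = np.copy(theta)
    theta_minus[i] = theta_minus[i] - epsilon  # minus here, not plus
    J_minus[i] = cost_fn(theta_minus)

    # Two-sided difference: gradapprox[i] approximates dJ/dtheta_i
    gradapprox[i] = (J_plus[i] - J_minus[i]) / (2 * epsilon)

print(gradapprox.ravel())  # ~ [2. 4. 6.], since dJ/dtheta_i = 2 * theta_i here
```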
Solved this part. It turned out to be an error in theta_minus (I copy-pasted it from theta_plus but forgot to change "+ epsilon" to "- epsilon"; see the sketch below).
Anyway, now I get the error "There is a mistake in the backward propagation! difference = 0.2850931567761623".
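For anyone else who hits the same copy-paste trap, the fix looks like this (a sketch using the pseudocode's names, not the graded code itself):

```python
# Buggy: copy-pasted from the theta_plus branch, epsilon still ADDED
theta_minus = np.copy(parameters_values)
theta_minus[i] = theta_minus[i] + epsilon  # WRONG for the minus side

# Fixed: subtract epsilon for the minus-side perturbation
theta_minus = np.copy(parameters_values)
theta_minus[i] = theta_minus[i] - epsilon
```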
Hello @qwertyuiop1234567,
Please continue reading the description below that message, and you will find that it is indeed one of the two expected messages. The description will also give you a hint on how to get rid of that expected "error" if you would like to.
Cheers,
Raymond
Please check if you are computing difference exactly once, outside the for loop, after computing gradapprox for all parameters. Here are some values (they should be close on your version of numpy as well) to check your work:
numerator = 8.046945911183278e-07
denominator = 6.77035941381653
difference = 1.1885552035482147e-07
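In code, that final comparison happens once, after the loop, with np.linalg.norm on both the numerator and the denominator. A minimal sketch with made-up toy vectors (in the assignment, grad comes from your backward propagation and gradapprox from the loop):

```python
import numpy as np

# Toy stand-ins; in the assignment both are (num_parameters, 1) vectors.
grad = np.array([[2.0], [4.0], [6.0]])
gradapprox = np.array([[2.0 + 1e-6], [4.0], [6.0]])

# Computed exactly once, OUTSIDE the for loop:
numerator = np.linalg.norm(grad - gradapprox)                    # ||grad - gradapprox||
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)  # ||grad|| + ||gradapprox||
difference = numerator / denominator                             # a plain float, not an ndarray

print(difference)  # ~ 6.7e-8 for these toy values
```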
Thanks very much! Your answer helped me figure out my issue.