Gradient_checking Week 1 Assignment 3

Hi everyone,

I’m doing the last exercise of the assignment, and this is the message I get.

There is a mistake in the backward propagation! difference = 0.2850931567761623

In the next block, this value (0.2850931567761623) is used for the verification, and when I submitted my work I got 100. But I still don't understand why it is valid.

I thought it was supposed to be only 1e-17 or something like that.

Thanks

The thing to notice is that they added some "fake" intentional mistakes to the back propagation code in order to exercise your gradient checking logic. If you leave the mistakes in place, you get the value you show. But if you fix the mistakes and rerun, you get a very small value for the difference. The test code checks for either value, so you get "yes" as the answer whether or not you fixed the intentional backprop bugs.
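To make the two regimes concrete, here is a minimal toy sketch (a 1-D example of my own, not the assignment's actual code) of how gradient checking flags a deliberately wrong derivative while passing a correct one:

```python
def J(theta):
    # Toy cost function J(theta) = theta^2; its true derivative is 2*theta.
    return theta ** 2

def grad_correct(theta):
    return 2 * theta          # correct "backprop" result

def grad_buggy(theta):
    return 3 * theta          # an intentional "backprop mistake"

def difference(grad, theta, eps=1e-7):
    # Centered finite-difference approximation of dJ/dtheta,
    # compared to the analytic gradient with a relative error.
    grad_approx = (J(theta + eps) - J(theta - eps)) / (2 * eps)
    return abs(grad - grad_approx) / (abs(grad) + abs(grad_approx))

theta = 2.0
print(difference(grad_correct(theta), theta))  # very small (rounding noise only)
print(difference(grad_buggy(theta), theta))    # large, about 0.2
```

The same check run twice gives exactly the two kinds of values discussed here: tiny when the gradient is right, order-0.1 or larger when a bug is left in.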

Oh, I had noticed at least one mistake, which I fixed, but I didn't check the rest.

I’m gonna go check it now. I may have also missed the part where they say there are mistakes to fix.

Update: Took me 1 second. Thanks @paulinpaloalto

I have the same problem:

"There is a mistake in the backward propagation! difference = 0.8461538460709301"

and I couldn't find what is wrong with the backward propagation function.

That error means your gradient approximation logic is not correct. You should get one of the two values that you see in the assertion in the test logic. The first step is to carefully check your code and compare it to the instructions and formulas shown in the assignment.
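For reference, the usual recipe is a centered difference on each parameter followed by a relative-norm comparison. A hedged sketch of that logic (variable names here are illustrative, not necessarily the notebook's):

```python
import numpy as np

def gradient_check(J, theta, grad, eps=1e-7):
    """Compare an analytic gradient against a numerical approximation.

    J     : cost function taking a 1-D parameter vector
    theta : current parameter vector (1-D numpy array)
    grad  : gradient computed by your backward propagation
    """
    grad_approx = np.zeros_like(theta)
    for i in range(theta.size):
        # Perturb one component at a time (centered difference).
        theta_plus = theta.copy()
        theta_plus[i] += eps
        theta_minus = theta.copy()
        theta_minus[i] -= eps
        grad_approx[i] = (J(theta_plus) - J(theta_minus)) / (2 * eps)

    # Relative difference: small (e.g. < 1e-7) means backprop agrees
    # with the numerical gradient; a large value signals a bug.
    numerator = np.linalg.norm(grad - grad_approx)
    denominator = np.linalg.norm(grad) + np.linalg.norm(grad_approx)
    return numerator / denominator
```

A common source of large differences is perturbing the whole vector at once instead of one component per loop iteration, or dividing by eps instead of 2 * eps.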

Also please check your DMs for a message from me about how to proceed.