Course 2 week 1 gradient checking, last task


What could be the cause of the erroneous result? I wrote the previous gradient_check function in a very similar way, and there the result was 1.221195460051932e-20.

Check your code again, carefully, and compare it with the formula given to you. Maybe a typo or a small mistake…

Hi @Kajetan_Frackowiak,

Since you are already using np.linalg.norm, you don’t need the **2.

Best,
Mubsi
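For context, np.linalg.norm already returns the (un-squared) L2 norm, so squaring it changes the value of the difference. A minimal sketch of the difference computation used in gradient checking, assuming `grad` and `gradapprox` follow the assignment's naming convention:

```python
import numpy as np

def gradient_check_difference(grad, gradapprox):
    # Relative difference used in gradient checking:
    #   ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
    # np.linalg.norm returns the L2 norm directly, so no **2 is needed.
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator
```

If both vectors are nearly identical, the result should be a very small number (well below 1e-7 for a correct backward pass).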


Hello,

your error log indicates there is an issue in your backward propagation: the difference value is incorrect.

It was also mentioned that you are not using np.linalg.norm in either the numerator or the denominator, which is why your difference differs from the expected value. Check that code.

Regards
DP
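To show how the difference ties back to the backward pass, here is a hedged sketch of one-dimensional gradient checking with the two-sided (centered) approximation. The names `J`, `theta`, and `grad` are illustrative, following the assignment's convention; this is not the grader's exact code:

```python
import numpy as np

def gradient_check_1d(J, theta, grad, epsilon=1e-7):
    # Two-sided (centered) numerical approximation of dJ/dtheta.
    thetaplus = theta + epsilon
    thetaminus = theta - epsilon
    gradapprox = (J(thetaplus) - J(thetaminus)) / (2 * epsilon)

    # Relative difference between the analytic and numerical gradients;
    # np.linalg.norm is used in both the numerator and the denominator.
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator
```

For example, with J(theta) = theta**2 and the analytic gradient 2*theta, the difference should come out far below the usual 1e-7 threshold.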

It’s interesting sometimes that very similar mistakes that we’ve never seen before happen very close to each other in time. That other thread was from yesterday. Like a “disturbance in The Force”. :nerd_face: