W1A3: Wrong grad_check difference, but the result makes me wonder if I'm missing a 1 - difference

The result I get from gradient_check is 0.99999999999998. I've been through the function quite a few times, so either this is a (big) coincidence or I'm missing something like difference = 1 - difference at the end. If that were the case, my result would be a valid 1.9984014443252818e-14.

This is the first time I've gotten stuck on an exercise since the first course. I've read through the function many times and I believe it's correct, but that 0.99999999999998 makes me wonder if it's just the final calculation that's missing an adjustment.
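For context, this is how I understand the difference calculation from the notebook, written as a minimal self-contained sketch with made-up gradient values (not my actual assignment code):

```python
import numpy as np

# difference = ||grad - gradapprox||_2 / (||grad||_2 + ||gradapprox||_2)
# If the two gradients agree, this lands around 1e-7 or smaller;
# if gradapprox is essentially zero or completely off, the ratio tends toward 1.
grad = np.array([1.0, 2.0, 3.0])
gradapprox = np.array([1.0 + 1e-9, 2.0 - 1e-9, 3.0])

numerator = np.linalg.norm(grad - gradapprox)
denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
difference = numerator / denominator
print(difference)  # something tiny, ~1e-10 for these made-up values
```

So a value of almost exactly 1 suggests gradapprox is wildly different from grad, rather than a dropped "1 -" at the end.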

Thank you in advance.

Adding to this, I just finished the last function and the result was again "There is a mistake in the backward propagation! difference = 0.99999999999999".

This can’t be a coincidence, can it? I must be doing something that leaves me with 1 - result instead of result.

Any help before the deadline would be appreciated, thank you.

The issue with grad_check was:

gradapprox = (J_plus - J_minus) / (2 * epsilon)   <- OK
gradapprox = (J_plus - J_minus) / 2 * epsilon     <- NOT OK
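For anyone else who hits this: / and * have the same precedence in Python and evaluate left to right, so the NOT OK line actually computes ((J_plus - J_minus) / 2) * epsilon, i.e. it multiplies by epsilon instead of dividing by 2 * epsilon. A quick sketch with made-up numbers:

```python
J_plus, J_minus, epsilon = 1.0002, 0.9998, 1e-7

ok     = (J_plus - J_minus) / (2 * epsilon)  # ~2000.0  (the intended slope estimate)
not_ok = (J_plus - J_minus) / 2 * epsilon    # ~2e-11   (== ok * epsilon**2, essentially zero)

# With gradapprox shrunk by a factor of epsilon**2 it is negligible next to grad,
# so ||grad - gradapprox|| / (||grad|| + ||gradapprox||) collapses to roughly 1,
# which is exactly the 0.99999999999998 above.
print(ok, not_ok)
```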

My bad. Problem solved.

Had two other errors:

  1. Not noticing that the difference calculation belongs outside the for loop (see the sketch below): Week 1 Gradient Checking gradient_check_n - Deep Learning Specialization / DLS Course 2 - DeepLearning.AI
  2. A copy-paste error when calculating theta_minus[i]: Course-2 : Week -1 Assignments, Programming Assignment: Gradient Checking - Deep Learning Specialization / DLS Course 2 - DeepLearning.AI
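In case it helps anyone, here is roughly how those pieces fit together, as a self-contained sketch with a toy cost function standing in for the notebook's forward/backward propagation (cost_and_grad and gradient_check_n_sketch are just placeholder names, not the notebook's functions):

```python
import numpy as np

def cost_and_grad(theta):
    """Toy stand-in: J(theta) = sum(theta**2), analytic gradient 2*theta."""
    return np.sum(theta ** 2), 2 * theta

def gradient_check_n_sketch(theta, epsilon=1e-7):
    _, grad = cost_and_grad(theta)        # the "backprop" gradient to verify
    gradapprox = np.zeros_like(theta)

    for i in range(theta.shape[0]):
        theta_plus = np.copy(theta)       # fresh copy each iteration
        theta_plus[i] += epsilon
        J_plus, _ = cost_and_grad(theta_plus)

        theta_minus = np.copy(theta)      # another fresh copy
        theta_minus[i] -= epsilon         # the copy-paste trap: this must be MINUS epsilon
        J_minus, _ = cost_and_grad(theta_minus)

        gradapprox[i] = (J_plus - J_minus) / (2 * epsilon)

    # The relative difference is computed once, AFTER the loop, over the whole vectors.
    numerator = np.linalg.norm(grad - gradapprox)
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
    return numerator / denominator

print(gradient_check_n_sketch(np.array([1.0, -2.0, 3.0])))  # ~1e-10 or smaller
```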