Week 1 - Lab 3 - Exercise 3 - gradient_check

So, it is probably something stupid, but I cannot make this work. The forward and backward propagation functions pass the tests, but when I get to the gradient_check one, it fails. I think I have the formulas right, but the result is
“There is a mistake in the backward propagation! difference = 0.14285714314311862”
and it should be
" Your backward propagation works perfectly fine! difference = 7.814075313343006e-11".

I tried printing the values of grad and gradapprox, and they are 3 and 4.000000002335469, respectively.
One idea was that my backward_propagation function is not right, but I just did dtheta = x, and that is the derivative of theta * x with respect to theta, so it should be fine.
Also, J_plus and J_minus are computed using the forward_propagation function, and grad using backward_propagation. Is that right?
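For reference, this is the two-sided approximation I am using, as given in the notebook instructions:

    \text{gradapprox} = \frac{J(\theta + \varepsilon) - J(\theta - \varepsilon)}{2\varepsilon}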


But grad and gradapprox are vectors. How do these become a scalar?

You don’t need to compute grad. It is already given to you.


I added some print statements to my 1D gradient_check function and ran the test cell. Here’s what I see:

J_plus = 12.0000003
J_minus = 11.9999997
gradapprox = 2.9999999995311555
grad = 3
Your backward propagation works perfectly fine! difference = 7.814075313343006e-11

So your grad value is correct, but gradapprox is not. Please try adding prints for J_plus and J_minus; that should narrow down where the bug is.
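In case it helps, here is a rough sketch of how my instrumented 1D gradient_check is structured. This is an outline rather than the official solution, and it assumes the notebook's epsilon = 1e-7 default plus the forward_propagation and backward_propagation functions you have already passed:

    import numpy as np

    def gradient_check(x, theta, epsilon=1e-7):
        # Evaluate the cost at the two perturbed values of the parameter.
        # Note that the perturbed values are what get passed to J, not theta.
        theta_plus = theta + epsilon
        theta_minus = theta - epsilon
        J_plus = forward_propagation(x, theta_plus)
        J_minus = forward_propagation(x, theta_minus)
        gradapprox = (J_plus - J_minus) / (2 * epsilon)

        # The analytic gradient comes from backprop at the original theta.
        grad = backward_propagation(x, theta)

        print("J_plus =", J_plus)
        print("J_minus =", J_minus)
        print("gradapprox =", gradapprox)
        print("grad =", grad)

        # Relative difference between the analytic and numeric gradients.
        numerator = np.linalg.norm(grad - gradapprox)
        denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)
        return numerator / denominator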


No, I was asking about Exercise 3, which does not use vectors, and grad is not given there.
What you said would be true for Exercise 4.
What really annoyed me is that I managed to do Exercise 4 but could not make Exercise 3 work, haha.

Yes, I can see that would be both surprising and frustrating. But that’s got to mean you just aren’t looking hard enough at the details. The formulas are the same. The only difference is you don’t need to iterate over the parameters, since there’s only one.

Check the additional values that I showed versus yours …
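For reference, the comparison formula is the same in both exercises:

    \text{difference} = \frac{\lVert \text{grad} - \text{gradapprox} \rVert_2}{\lVert \text{grad} \rVert_2 + \lVert \text{gradapprox} \rVert_2}

In the 1D case the norms are just absolute values.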

OK, I got it: the problem was that I called forward_propagation with the wrong parameters.
Thanks, I forgot to check the other values; my J_plus and J_minus were wrong.


The assignment is a little confusing because forward_propagation() and backward_propagation() have very similar names but completely different purposes and return types (see the sketch after this list):

  • Fwd prop returns the cost. In most DLAI courses, the cost is returned by a compute_cost() function.
  • Bkwd prop returns the gradients. In most DLAI courses, the gradients are returned by a compute_gradients() function.
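To make that concrete, here is roughly what the two 1D functions in this assignment boil down to, with J(theta) = theta * x as in the exercise (a sketch of the idea, not copied from the notebook):

    def forward_propagation(x, theta):
        # Despite the name, this returns the cost J(theta) = theta * x.
        J = theta * x
        return J

    def backward_propagation(x, theta):
        # Despite the name, this returns the gradient dJ/dtheta = x.
        dtheta = x
        return dtheta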

My fault.

Glad to know you solved it.

Sorry to hijack the thread. I'm experiencing the same issue and didn't want to create a duplicate topic. After triple-checking my parameters, I can't spot the source of the error. I suspect my logic for forward_propagation may be wrong, but that exercise passed its tests successfully.

J_plus:  12.0000004
J_minus:  11.9999996
gradapprox:  3.9999999934536845
grad:  3
numerator:  0.9999999934536845
denominator:  6.9999999934536845
diff:  0.1428571420555532
There is a mistake in the backward propagation! difference = 0.1428571420555532

My gradapprox value is clearly wrong/different from the one @paulinpaloalto shared, but my J_plus/J_minus are only marginally off. Any hints on the potential source of my issue? Thanks in advance for your help.
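Working backwards from my own prints (my arithmetic, not anything from the notebook):

    \text{gradapprox} = \frac{12.0000004 - 11.9999996}{2 \times 10^{-7}} = \frac{8 \times 10^{-7}}{2 \times 10^{-7}} = 4

so the gradapprox arithmetic is at least consistent with my J values; whatever is wrong must be in how J_plus and J_minus themselves are computed.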

I got the correct answer. These instructions misled me:

\theta^{+} = \theta + \varepsilon
\theta^{-} = \theta - \varepsilon
J^{+} = J(\theta^{+})
J^{-} = J(\theta^{-})

since the idea is to find the theta_plus of the parameter x rather than the value theta.
