# Course 1 Week 4 programming assignment #2 error

hello,

In the Week 4 programming assignment “Building_your_Deep_Neural_Network_Step_by_Step”, section 6.3 - L-Model Backward, the formula provided for dAL is not correct.

Given:

```python
dAL = - (np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))  # derivative of cost with respect to AL
```

Correct formula:

```python
dAL = - (np.divide(Y, AL) + np.divide(1 - Y, 1 - AL))  # derivative of cost with respect to AL
```

I have used the given formula instead of the correct one, because if I use the correct one the test doesn’t pass.
Thank you

The formula you show as “Correct” is actually incorrect. It is the first one (Given) that is right, which is why that one passes the test. Look at how the parentheses work in that expression: the signs of the two terms are opposite because of the Chain Rule. Differentiating log(1 - AL) produces an extra factor of -1 from the inner derivative of (1 - AL), which flips the sign of the second term. There is nothing wrong with the material.
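A quick numerical sanity check (just a sketch, not part of the assignment) confirms the sign: the “Given” formula agrees with a finite-difference gradient of the cross-entropy cost, while the “+” version would not.

```python
import numpy as np

def cost(AL, Y):
    """Cross-entropy cost, averaged over the m examples."""
    return -np.mean(Y * np.log(AL) + (1 - Y) * np.log(1 - AL))

AL = np.array([0.8, 0.3, 0.6])   # example activations
Y  = np.array([1.0, 0.0, 1.0])   # example labels
m = Y.size

# Analytic derivative (the "Given" formula), scaled by 1/m because
# the cost above averages over the m examples.
dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL)) / m

# Central finite-difference approximation of the same gradient.
eps = 1e-6
dAL_num = np.zeros_like(AL)
for i in range(m):
    plus, minus = AL.copy(), AL.copy()
    plus[i] += eps
    minus[i] -= eps
    dAL_num[i] = (cost(plus, Y) - cost(minus, Y)) / (2 * eps)

print(np.max(np.abs(dAL - dAL_num)))  # tiny value: the formulas agree
```

The `AL` and `Y` arrays here are arbitrary test values, not assignment data; any activations strictly between 0 and 1 would do.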

Hello,

According to the video “Backpropagation Intuition (Optional)” in Course 1 Week 3, at 1:18, the loss function is L(a, y) = -y log(a) - (1 - y) log(1 - a), and thus dL(a, y)/da = (-y/a) + ((1 - y)/(1 - a)) by the chain rule.

And in Course 1 Week 4, “Forward and Backward Propagation”, at 7:05, it says da = (y/a) + ((1 - y)/(1 - a)).

It is all consistent with the “Given” formula you showed above. You apparently transcribed it incorrectly:

The minus sign on the first term is pretty clear.
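To see where that minus sign comes from, you can differentiate L(a, y) by hand and verify it numerically (a minimal sketch with arbitrary test values):

```python
import numpy as np

def L(a, y):
    """Cross-entropy loss for a single example."""
    return -y * np.log(a) - (1 - y) * np.log(1 - a)

def dL_da(a, y):
    # d/da [-y log(a)]        = -y/a          (minus sign on the first term)
    # d/da [-(1-y) log(1-a)]  = +(1-y)/(1-a)  (inner derivative of (1-a) is -1)
    return -y / a + (1 - y) / (1 - a)

a, y, eps = 0.7, 1.0, 1e-6
numeric = (L(a + eps, y) - L(a - eps, y)) / (2 * eps)
print(abs(dL_da(a, y) - numeric))  # tiny value: the analytic form matches
```

The values `a = 0.7`, `y = 1.0` are arbitrary; the check works for any a strictly between 0 and 1.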
