L_model_backward error

I’m not sure I fully understand this function, so I’ll share what I did and the error I got.
If you can provide more material or another explanation, it would be appreciated.

I’ll remove my code once I get a reply; I just don’t know how to describe the issue otherwise.

You need to pay a bit closer attention to how linear_activation_backward works. Each time you call it, it returns dW and db for the current layer, but dA for the previous layer. You are assigning the dA value as if it were for the same layer.
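To make the indexing concrete, here is a minimal sketch of how a loop like this can be wired up. The function bodies and variable names are assumptions for illustration (they are not the assignment's actual implementation); the point is only the last three lines of the loop: dA_prev is stored under the index of the *previous* layer, while dW and db keep the current layer's index.

```python
import numpy as np

def linear_activation_backward(dA, cache, activation):
    """Hypothetical sketch: given dA for the CURRENT layer, return
    dA_prev (for the PREVIOUS layer) plus dW and db (for the CURRENT layer)."""
    (A_prev, W, b), Z = cache
    if activation == "relu":
        dZ = dA * (Z > 0)                 # ReLU derivative
    else:                                 # sigmoid
        s = 1 / (1 + np.exp(-Z))
        dZ = dA * s * (1 - s)
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ                    # gradient w.r.t. the previous layer's A
    return dA_prev, dW, db

def L_model_backward(dAL, caches):
    """Walk backward through the layers, keeping the indices straight."""
    grads = {}
    L = len(caches)
    dA = dAL
    for l in reversed(range(1, L + 1)):
        activation = "sigmoid" if l == L else "relu"
        dA_prev, dW, db = linear_activation_backward(dA, caches[l - 1], activation)
        grads["dA" + str(l - 1)] = dA_prev  # PREVIOUS layer's index, not l
        grads["dW" + str(l)] = dW           # current layer's index
        grads["db" + str(l)] = db           # current layer's index
        dA = dA_prev                        # feed it into the next iteration
    return grads
```

The bug described above is writing `grads["dA" + str(l)] = dA_prev`, which stores the previous layer's gradient under the current layer's key; the shapes then stop matching one layer further back.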


Thanks, and sorry. I missed that.