Week 4, Exercise 9: L_model_backward

(Moderator edit: code removed)

I am getting the above error. Has anyone run into the same error and solved it?

Hi @Sabin_Adhikari, you are missing the `grads` dictionary. Here's a link that should answer your query. Also, check the layer you are using when calling the function; it shouldn't be dA_prev_temp.
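To make the bookkeeping concrete without sharing the graded solution, here is a generic sketch of the convention described in the assignment docstring: gradients live in the `grads` dictionary under string keys built from the layer index, and each backward step should read its upstream dA from that dictionary rather than from a stale loop-local variable. The key names follow the docstring; the computations are placeholders, and the exact off-by-one indexing is deliberately left to the notebook's instructions.

```python
import numpy as np

# Hypothetical sketch of the grads-dictionary pattern (NOT the graded solution).
grads = {}
L = 3                                    # assumed number of layers for this sketch
grads["dA" + str(L)] = np.ones((1, 4))   # pretend this came from the output layer

for l in reversed(range(1, L)):
    # Read the upstream gradient from the dictionary, not a local variable.
    dA_upstream = grads["dA" + str(l + 1)]
    # ... the call to linear_activation_backward would go here ...
    grads["dA" + str(l)] = dA_upstream * 0.5   # placeholder computation
    grads["dW" + str(l)] = np.zeros((1, 1))    # placeholder
    grads["db" + str(l)] = np.zeros((1, 1))    # placeholder

print(sorted(grads))  # shows the layer-indexed key pattern: dA1, dA2, dW1, ...
```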

Thanks!

I used grads instead of dA_prev_temp.

But now I'm getting another error.

(Solution code removed by DLAI Staff, as sharing it publicly is against community’s Honour Code)

I changed it, but I'm still getting the same error.

I found the solution. My formula was wrong.

Hi @Sabin_Adhikari, yes, I did mention in the next line to check the layer you were using when calling linear_activation_backward for the l-th layer (the relu part). That, in addition to the missing grads, was making the equation wrong. It's great to see that you found the solution!


@Rashmi Could you please help clarify what the issue was here?

I have gotten the exact same error and results as Sabin at the top.

I have used the grads dictionary for the dA parameter of the linear_activation_backward function in the relu loop.

I have also double-checked all the l indexes in this loop. I don't see where I have gone wrong.

Hi, Ayham,

Welcome to the community.

As mentioned in my earlier replies on this thread, using the wrong activation can change the expected output. Also, while doing backprop, you need to check the formula against what is given in the instructions.

Also, check the layer index l (versus L). It can surprise you whether you have typed l or 1 in places :slight_smile:
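As a purely hypothetical illustration of that l-versus-1 trap (the names here are made up for the example, not from the assignment): a lowercase l mistyped as the digit 1 freezes the index instead of following the loop, and the two lines look nearly identical.

```python
# Hypothetical illustration of the l-versus-1 typo.
caches = ["cache_for_layer_1", "cache_for_layer_2", "cache_for_layer_3"]

for l in reversed(range(len(caches))):
    current = caches[l]   # intended: the cache for the current layer l
    frozen = caches[1]    # typo: always layer 2's cache, whatever l is
    print(l, current, frozen)
```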

I just checked the equation for my dAL.

It turns out I was missing a pair of brackets after the first minus sign.
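For anyone who hits the same thing, here is what that pair of brackets changes, written out for the cross-entropy gradient dAL that the assignment instructions give:

```latex
% Correct: gradient of the cross-entropy cost with respect to AL,
% as given in the assignment instructions
dA^{[L]} = -\left(\frac{Y}{A^{[L]}} - \frac{1-Y}{1-A^{[L]}}\right)

% With the brackets after the first minus sign missing, this becomes
dA^{[L]} = -\frac{Y}{A^{[L]}} - \frac{1-Y}{1-A^{[L]}}

% which flips the sign of the second term and yields wrong gradients.
```

If I recall the notebook hint correctly, the NumPy line given in the template is dAL = - (np.divide(Y, AL) - np.divide(1 - Y, 1 - AL)), so dropping that outer pair of parentheses reproduces exactly this mistake.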

Thanks for your help!

Good to hear that!

Keep learning!
