Exercise 9 - L_model_backward W4_A1

Salam Alaykum,

Please help me figure out where my issue is:

Make sure you have passed all the above tests. In this exercise, you just need to call the linear_activation_backward function with the correct arguments. Please double-check the instructions.

Hi Saif,
Yes, I passed all of them. Please check my code below and reply; I think there is an issue with the cache. What is its purpose?

Mentor Edit: Solution Code Deleted

First, your current_cache in the loop is incorrect. It should be the cache of the $l^{th}$ layer, not of layer $L-1$. Second, your input arguments to linear_activation_backward (in the loop) are also incorrect. Please pay attention to the comments given to you:
# Inputs: "grads["dA" + str(l + 1)], current_cache
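For anyone reading along with the same indexing confusion, here is a minimal sketch of the loop pattern that comment describes for a generic $L$-layer network. It is only an illustration of the indexing, not the graded solution, and it assumes the notebook's `linear_activation_backward` helper, the `caches` list, and the `grads` dictionary are already in scope:

```python
# Illustrative sketch only: assumes linear_activation_backward, caches, grads,
# and L are defined as in the notebook. The key point is that inside the loop
# the cache comes from caches[l], not caches[L - 1].
for l in reversed(range(L - 1)):
    current_cache = caches[l]  # the cache stored for this layer during forward prop
    dA_prev, dW, db = linear_activation_backward(
        grads["dA" + str(l + 1)],  # gradient arriving from the layer above
        current_cache,
        activation="relu",
    )
    grads["dA" + str(l)] = dA_prev
    grads["dW" + str(l + 1)] = dW
    grads["db" + str(l + 1)] = db
```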


Thank you, the issue is solved.


Hi Saif,
Could you please check below where the issue is:
Mentor Edit: Solution Code Deleted

Check the last part of the formula again:

$$W^{[l]} = W^{[l]} - \alpha \, dW^{[l]}$$

It's $dW^{[l]}$, not $W^{[l]}$. The same applies to $b$.
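As a concrete illustration of that update rule, here is a small sketch (not the graded solution) assuming the notebook's usual naming, where `parameters` holds `"W1"`, `"b1"`, … and `grads` holds `"dW1"`, `"db1"`, …:

```python
def update_parameters(parameters, grads, learning_rate):
    # W[l] = W[l] - alpha * dW[l], and likewise for b:
    # the step subtracts the gradient, not the parameter itself.
    L = len(parameters) // 2  # number of layers
    for l in range(1, L + 1):
        parameters["W" + str(l)] = parameters["W" + str(l)] - learning_rate * grads["dW" + str(l)]
        parameters["b" + str(l)] = parameters["b" + str(l)] - learning_rate * grads["db" + str(l)]
    return parameters
```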

PS: Sharing your code is not allowed, so please avoid posting it.


Thank you Saif
for you quick response and your effort

Hi Saif,

I submitted the assignment, but it keeps showing 0/100. Why? The grader feedback mentions a "duplicate grade id" for cell cell-37b22e0664a4949e.

I’ve never seen that “duplicate grade id” message before. That sounds like something is structurally wrong with your notebook. Are you sure that you did not accidentally duplicate any of the graded cells?

One thing to try would be to get a clean copy of the notebook using these instructions and then carefully “copy/paste” over your solution code from only the “YOUR CODE HERE” sections.


I looked up that cell id: cell-37b22e0664a4949e and it turns out it is for the initialize_parameters_deep cell. My guess is that you accidentally duplicated that cell. Please check and if you find two copies, delete one of them and try again.


Thank you @paulinpaloalto, you are fantastic.