W4_A1_Ex-9_L_model_backward: wrong shape and output

Lab ID: bcjqowwi

I’ve spent a significant amount of time on this exercise and I’m missing something simple.

dA0 = [[ 0.57243624  0.        ]
 [-0.62840214  0.        ]
 [ 0.07375161  0.        ]]
dA1 = [[ 0.12913162 -0.44014127]
 [-0.14175655  0.48317296]
 [ 0.01663708 -0.05670698]]
dW1 = [[-0.55240952  0.17511096  0.6762397 ]]
dW2 = [[-0.39202432 -0.13325855 -0.04601089]]
db1 = [[-0.2795438]]
db2 = [[0.15187861]]
Error: Wrong shape for variable dA0.
Error: Wrong shape for variable dW1.
Error: Wrong shape for variable db1.
Error: Wrong output for variable dA0.
Error: Wrong output for variable dW1.
Error: Wrong output for variable db1.
1 Tests passed
2 Tests failed

AssertionError                            Traceback (most recent call last)
in
      9 print("db2 = " + str(grads['db2']))
     10 
---> 11 L_model_backward_test(L_model_backward)

~/work/release/W4A1/public_tests.py in L_model_backward_test(target)
    442     ]
    443 
--> 444     multiple_test(test_cases, target)
    445 
    446 def update_parameters_test(target):

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140         print('\033[92m', success, " Tests passed")
    141         print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142         raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for L_model_backward. Check your equations and avoid using global variables inside the function.

Hi @thebigape,

I went in and took a look at your notebook (thank you for sharing the lab ID beforehand!). Your cache value in the latter part of Ex 9 was wrong, and so was the line of code that follows it. I have left comments in your notebook.

Best,
Mubsi

I have the same issue and am unable to find the error, though I do think my cache is right. Can you help? How do I share my lab ID?

Only the course staff (e.g. Mubsi) can look at other people’s notebooks. Mubsi is a pretty busy guy, so it’s better not to depend on his superpowers. :grin: Why don’t you try showing us the error output you are getting and maybe we can offer advice based on that.


Hello sir,
I have a similar issue and here is my error output.
Any suggestions on my errors?

Hello John,

Did you follow the suggestions shared by Mubsi in his reply above?
Please have a look at the cache value again. Thanks.


That’s better since the shapes are now correct, but it’s now complaining about the actual values. Getting the shapes correct is a pretty low bar for success :nerd_face: … Now you need to take another pass through the code and compare what you wrote to the formulas shown in the instructions. I added some print statements to show the shapes and some of the intermediate values:

dAL = [[-0.5590876   1.77465392]]
L = 2
dA2.shape = (1, 2)
db.shape (1, 1)
dW2.shape = (1, 3)
db.shape (3, 1)
l = 0
dW1.shape = (3, 4)
db1.shape = (3, 1)
dA0 = [[ 0.          0.52257901]
 [ 0.         -0.3269206 ]
 [ 0.         -0.32070404]
 [ 0.         -0.74079187]]
dA1 = [[ 0.12913162 -0.44014127]
 [-0.14175655  0.48317296]
 [ 0.01663708 -0.05670698]]
dW1 = [[0.41010002 0.07807203 0.13798444 0.10502167]
 [0.         0.         0.         0.        ]
 [0.05283652 0.01005865 0.01777766 0.0135308 ]]
dW2 = [[-0.39202432 -0.13325855 -0.04601089]]
db1 = [[-0.22007063]
 [ 0.        ]
 [-0.02835349]]
db2 = [[0.15187861]]

Notice that all the values you get are quite a bit different from what I show there. But interestingly, some of your values, like dA1, dW2 and db2, in your first post with the wrong shapes actually agree with what I show. Hmmmm. Of course everything happens backwards here: we start with the output layer (layer 2 in this test case). So it looks like that step was correct in your first post, but things go off the rails for layer 1 and layer 0. That should be a clue as to where to look for the issues. In your second post, though, all the values are different.
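To make the iteration order concrete, here is a generic skeleton of an L-layer backward pass. The helper `layer_backward` is a hypothetical stand-in (it ignores the activation derivative), and this is not the course's exact template, just a sketch of the reversed-loop pattern:

```python
import numpy as np

def layer_backward(dA, cache):
    """Hypothetical stand-in for linear_activation_backward.
    cache = (A_prev, W); relu/sigmoid details omitted for brevity."""
    A_prev, W = cache
    m = A_prev.shape[1]
    dZ = dA                        # pretend the activation derivative is 1
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def l_model_backward_sketch(dAL, caches):
    """Walk layers L, L-1, ..., 1 and store grads under dA{l-1}, dW{l}, db{l}."""
    grads = {}
    L = len(caches)
    dA = dAL
    for l in reversed(range(1, L + 1)):       # layer L first, layer 1 last
        dA_prev, dW, db = layer_backward(dA, caches[l - 1])
        grads[f"dA{l-1}"] = dA_prev
        grads[f"dW{l}"] = dW
        grads[f"db{l}"] = db
        dA = dA_prev                          # feed layer l's dA_prev into layer l-1
    return grads

# Two-layer example with the same shapes as the test case above
rng = np.random.default_rng(1)
caches = [(rng.standard_normal((4, 2)), rng.standard_normal((3, 4))),
          (rng.standard_normal((3, 2)), rng.standard_normal((1, 3)))]
grads = l_model_backward_sketch(rng.standard_normal((1, 2)), caches)
print(sorted(grads))   # ['dA0', 'dA1', 'dW1', 'dW2', 'db1', 'db2']
```

If the layer-2 gradients look right but layer 1 and layer 0 don't, the bug is usually in how the loop indexes the caches or in which dA gets passed to the next iteration.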

The other high level point to make is that we are assuming here that your previous functions like linear_backward and linear_activation_backward passed their test cases. So whatever the problem is, it must be in your L_model_backward logic. A perfectly correct subroutine can still give a wrong answer if you pass it bad data, right? So clarity of thought about where to look for the problem is always a useful thing to avoid wasting effort looking in the wrong places. :nerd_face:
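To illustrate that point, here is a minimal numpy sketch showing that a correct subroutine fed the wrong cache still returns wrong values. The `linear_backward` here is a simplified stand-in, not the course's exact implementation:

```python
import numpy as np

def linear_backward(dZ, cache):
    """Simplified stand-in for the assignment's linear_backward.
    cache holds (A_prev, W, b) for the layer being differentiated."""
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

rng = np.random.default_rng(0)
A_prev = rng.standard_normal((3, 2))     # activations feeding this layer
W1 = rng.standard_normal((1, 3))
b1 = rng.standard_normal((1, 1))
dZ1 = rng.standard_normal((1, 2))

right_cache = (A_prev, W1, b1)
wrong_cache = (A_prev[:, ::-1], W1, b1)  # same shapes, wrong data

_, dW_good, _ = linear_backward(dZ1, right_cache)
_, dW_bad, _ = linear_backward(dZ1, wrong_cache)
print(np.allclose(dW_good, dW_bad))      # False: same shapes, different values
```

So passing shape checks only tells you the plumbing is connected, not that the right data is flowing through it.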

I passed the earlier sections and they all show a perfect score.
That is all; I will attempt the “print everything and check” approach.
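For the “print everything and check” approach, a small helper that dumps every entry of the grads dictionary can make shape mismatches obvious at a glance. The sample values below are made up; only the key naming (dA/dW/db plus a layer number) follows the assignment's convention:

```python
import numpy as np

def dump_shapes(grads):
    # Print each gradient's key and shape, sorted so layers appear in order
    for key in sorted(grads):
        print(f"{key}.shape = {grads[key].shape}")

# Hypothetical 2-layer example with the shapes the test case expects
grads = {
    "dA0": np.zeros((4, 2)), "dW1": np.zeros((3, 4)), "db1": np.zeros((3, 1)),
    "dA1": np.zeros((3, 2)), "dW2": np.zeros((1, 3)), "db2": np.zeros((1, 1)),
}
dump_shapes(grads)
```

Comparing that printout against the parameter shapes (dW{l} must match W{l}, db{l} must match b{l}) usually pinpoints which layer's computation went wrong.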