I am getting the following assertion error in my code.
----------------------------------ERROR BEGIN----------------------------------------
dA0 = [[ 0. 2.15735098]
[ 0. -1.34961884]
[ 0. -1.32395514]
[ 0. -3.05819414]]
dA1 = [[ 0.57243624 -1.81702513]
[-0.62840214 1.99467189]
[ 0.07375161 -0.23410211]]
dW1 = [[1.69300653 0.32230299 0.56963799 0.43355858]
[0. 0. 0. 0. ]
[0.21812379 0.04152491 0.07339109 0.05585887]]
dW2 = [[-1.65635369 -0.53809236 -0.14346586]]
db1 = [[-0.90851256]
[ 0. ]
[-0.11705105]]
db2 = [[0.60778316]]
Error: Wrong output for variable dA1.
Error: Wrong output for variable dW2.
Error: Wrong output for variable db2.
Error: Wrong output for variable dA0.
Error: Wrong output for variable dW1.
Error: Wrong output for variable db1.
2 Tests passed
1 Tests failed
AssertionError Traceback (most recent call last)
&lt;ipython-input&gt; in &lt;module&gt;
      9 print("db2 = " + str(grads['db2']))
     10 
---> 11 L_model_backward_test(L_model_backward)

~/work/release/W4A1/public_tests.py in L_model_backward_test(target)
    442         ]
    443 
--> 444     multiple_test(test_cases, target)
    445 
    446 def update_parameters_test(target):

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140     print('\033[92m', success, " Tests passed")
    141     print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142     raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for L_model_backward. Check your equations and avoid using global variables inside the function.
Expected output:
dA0 = [[ 0. 0.52257901]
[ 0. -0.3269206 ]
[ 0. -0.32070404]
[ 0. -0.74079187]]
dA1 = [[ 0.12913162 -0.44014127]
[-0.14175655 0.48317296]
[ 0.01663708 -0.05670698]]
dW1 = [[0.41010002 0.07807203 0.13798444 0.10502167]
[0. 0. 0. 0. ]
[0.05283652 0.01005865 0.01777766 0.0135308 ]]
dW2 = [[-0.39202432 -0.13325855 -0.04601089]]
db1 = [[-0.22007063]
[ 0. ]
[-0.02835349]]
db2 = [[0.15187861]]
------------------------------------ERROR END-------------------------------------
Here’s my understanding of this section of the code:
- We compute the derivative dAL of the cost with respect to the output layer's activation, then pass it to linear_activation_backward (with the sigmoid activation) to compute dW(L), db(L), and dA(L-1).
- That dA(L-1) is then fed back into linear_activation_backward (with the relu activation) to compute dW(L-1), db(L-1), and dA(L-2), and so on down to the first layer.
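For comparison, here is a minimal, self-contained sketch of how that backward loop is usually structured. This is my own illustration with hypothetical helper names and a cross-entropy dAL, not the course's exact starter code, so treat the details as assumptions:

```python
import numpy as np

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def linear_backward(dZ, cache):
    # cache holds the forward-pass inputs of this layer: (A_prev, W, b)
    A_prev, W, b = cache
    m = A_prev.shape[1]                           # number of examples
    dW = dZ @ A_prev.T / m                        # average over the batch
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ                            # gradient w.r.t. previous activation
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    # cache = (linear_cache, Z): linear inputs plus the pre-activation Z
    linear_cache, Z = cache
    if activation == "sigmoid":
        s = sigmoid(Z)
        dZ = dA * s * (1 - s)                     # sigmoid'(Z) = s * (1 - s)
    else:                                         # "relu"
        dZ = dA * (Z > 0)                         # relu'(Z) is 1 where Z > 0
    return linear_backward(dZ, linear_cache)

def L_model_backward(AL, Y, caches):
    grads = {}
    L = len(caches)
    Y = Y.reshape(AL.shape)
    # derivative of the cross-entropy cost w.r.t. AL
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))
    # last layer: sigmoid activation
    grads[f"dA{L-1}"], grads[f"dW{L}"], grads[f"db{L}"] = \
        linear_activation_backward(dAL, caches[L - 1], "sigmoid")
    # remaining layers, walking backwards: relu activation
    for l in reversed(range(L - 1)):
        grads[f"dA{l}"], grads[f"dW{l+1}"], grads[f"db{l+1}"] = \
            linear_activation_backward(grads[f"dA{l+1}"], caches[l], "relu")
    return grads
```

One property worth checking against your own code: for sigmoid with cross-entropy, dAL multiplied by sigmoid'(Z) simplifies to AL - Y, so if your last-layer dZ doesn't match that, the sigmoid derivative (or dAL itself) is the first place to look.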
Can anyone help me figure out the issue that I am facing?