DLS → Course 1, Week 4

Assignment 1 → Exercise 8, "linear_activation_backward"

This function raises an error and I can't find where the problem is.

ValueError                                Traceback (most recent call last)
<ipython-input-54-1dd7958789b5> in <module>
      1 t_dAL, t_linear_activation_cache = linear_activation_backward_test_case()
      2 
----> 3 t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation = "sigmoid")
      4 print("With sigmoid: dA_prev = " + str(t_dA_prev))
      5 print("With sigmoid: dW = " + str(t_dW))

<ipython-input-53-fcfc3f361c38> in linear_activation_backward(dA, cache, activation)
     32         # YOUR CODE STARTS HERE
     33         dZ =  sigmoid_backward(dA, activation_cache)
---> 34         dA_prev, dW, db = linear_backward(dZ, cache)
     35         # YOUR CODE ENDS HERE
     36 

<ipython-input-47-d1bb1dddced5> in linear_backward(dZ, cache)
     14     db -- Gradient of the cost with respect to b (current layer l), same shape as b
     15     """
---> 16     A_prev, W, b = cache
     17     m = A_prev.shape[1]
     18 

ValueError: not enough values to unpack (expected 3, got 2)

Hi @Ahmed_Elshahawy

In the function linear_activation_backward, when activation == "sigmoid", the call to linear_backward(...) should receive (dZ, linear_cache), not (dZ, cache). Here cache is a 2-tuple (linear_cache, activation_cache), while linear_backward expects the 3-tuple linear_cache = (A_prev, W, b); that is why the unpacking fails with "expected 3, got 2".

Please feel free to ask any questions.
Thanks,
Abdelrahman
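To make the tuple structure concrete, here is a minimal numpy sketch; the shapes and the body of linear_backward are assumptions that mirror the assignment's docstrings, not the graded code:

```python
import numpy as np

np.random.seed(1)
# Assumed cache layout: cache = (linear_cache, activation_cache),
# where linear_cache = (A_prev, W, b) and activation_cache = Z.
A_prev = np.random.randn(3, 2)            # activations of the previous layer
W = np.random.randn(1, 3)
b = np.random.randn(1, 1)
Z = W @ A_prev + b
cache = ((A_prev, W, b), Z)

def linear_backward(dZ, cache):
    # Expects the inner 3-tuple (A_prev, W, b); passing the outer
    # 2-tuple triggers "not enough values to unpack (expected 3, got 2)".
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

dZ = np.random.randn(*Z.shape)
linear_cache, activation_cache = cache    # unpack the outer 2-tuple first
dA_prev, dW, db = linear_backward(dZ, linear_cache)  # pass the inner 3-tuple
```

Passing `linear_cache` instead of `cache` keeps each function responsible for one level of the tuple, which is why the assignment splits the cache in two in the first place.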

Oh, thanks, I got it.

Can you help me with this error in "L_model_backward", too?

IndexError                                Traceback (most recent call last)
<ipython-input-86-3ace16762626> in <module>
      1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
      3 
      4 print("dA0 = " + str(grads['dA0']))
      5 print("dA1 = " + str(grads['dA1']))

<ipython-input-85-758641b197b8> in L_model_backward(AL, Y, caches)
     60         # YOUR CODE STARTS HERE
     61         current_cache = caches[l]
---> 62         dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, activation='relu')
     63         grads["dA" + str(l)] = dW_temp[srt(l)]
     64         grads["dW" + str(l + 1)] = dW_temp[str(l)]

<ipython-input-17-5fb8aaab44b5> in linear_activation_backward(dA, cache, activation)
     22         # dA_prev, dW, db =  ...
     23         # YOUR CODE STARTS HERE
---> 24         dZ = relu_backward(dA, activation_cache)
     25         dA_prev, dW, db = linear_backward(dZ, linear_cache)
     26         # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
     54 
     55     # When z <= 0, you should set dz to 0 as well.
---> 56     dZ[Z <= 0] = 0
     57 
     58     assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 1 but corresponding boolean dimension is 3

Hi @Ahmed_Elshahawy

Sorry, a correction to my previous reply. For this error: when you call linear_activation_backward inside the loop, you should pass dA_prev_temp as the parameter, since it is updated on every iteration (each layer's input gradient becomes the upstream gradient for the next iteration), and so on for every earlier layer.

Alternatively, you can pass grads["dA" + str(l + 1)] as the parameter instead of dAL. You can choose either of the two solutions, as you like.

Thanks!
Abdelrahman
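The IndexError comes from relu_backward masking dA with the cached Z: the two arrays must share a shape, so reusing dAL (or the wrong layer's gradient) on every iteration blows up. A minimal reproduction, with relu_backward assumed to match the course helper in dnn_utils.py:

```python
import numpy as np

np.random.seed(0)

def relu_backward(dA, Z):
    # Mirrors the course helper's masking step (an assumption):
    # zero out dZ wherever the cached pre-activation Z is <= 0.
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

Z = np.random.randn(3, 2)         # this layer's cached pre-activation
dA_ok = np.random.randn(3, 2)     # matching shape: works
dZ = relu_backward(dA_ok, Z)

dA_wrong = np.random.randn(1, 2)  # a different layer's gradient
try:
    relu_backward(dA_wrong, Z)
    mismatch_raised = False
except IndexError:                # the same boolean-index error as above
    mismatch_raised = True
```

So whenever this error appears, the dA argument belongs to a different layer than the cache argument.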

It generates this:

IndexError                                Traceback (most recent call last)
<ipython-input-128-3ace16762626> in <module>
      1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
      3 
      4 print("dA0 = " + str(grads['dA0']))
      5 print("dA1 = " + str(grads['dA1']))

<ipython-input-127-9e816b2f8136> in L_model_backward(AL, Y, caches)
     60         # YOUR CODE STARTS HERE
     61         current_cache = caches[l-1]
---> 62         dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(l + 1)], current_cache, activation='relu')
     63         grads["dA" + str(l)] = dA_prev_temp[l]
     64         grads["dW" + str(l + 1)] = dW_temp[l]

<ipython-input-17-5fb8aaab44b5> in linear_activation_backward(dA, cache, activation)
     22         # dA_prev, dW, db =  ...
     23         # YOUR CODE STARTS HERE
---> 24         dZ = relu_backward(dA, activation_cache)
     25         dA_prev, dW, db = linear_backward(dZ, linear_cache)
     26         # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
     54 
     55     # When z <= 0, you should set dz to 0 as well.
---> 56     dZ[Z <= 0] = 0
     57 
     58     assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 3 but corresponding boolean dimension is 1

…or this, when passing dAL to linear_activation_backward:

dA0 = [0.57243624 0.        ]
dA1 = [[ 0.12913162 -0.44014127]
 [-0.14175655  0.48317296]
 [ 0.01663708 -0.05670698]]
dW1 = [-0.55240952  0.17511096  0.6762397 ]
dW2 = [[-0.39202432 -0.13325855 -0.04601089]]
db1 = [-0.2795438]
db2 = [[0.15187861]]
Error: Wrong shape for variable dA0.
Error: Wrong shape for variable dW1.
Error: Wrong shape for variable db1.
Error: Wrong output for variable dA0.
Error: Wrong output for variable dW1.
Error: Wrong output for variable db1.
 1  Tests passed
 2  Tests failed
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-134-3ace16762626> in <module>
      9 print("db2 = " + str(grads['db2']))
     10 
---> 11 L_model_backward_test(L_model_backward)

~/work/release/W4A1/public_tests.py in L_model_backward_test(target)
    442     ]
    443 
--> 444     multiple_test(test_cases, target)
    445 
    446 def update_parameters_test(target):

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140         print('\033[92m', success," Tests passed")
    141         print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142         raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for L_model_backward. Check your equations and avoid using global variables inside the function.

Yes: current_cache here should be caches[l], not caches[l-1], since you run backward propagation in a reverse loop and l already counts down through the layers.
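The index pattern discussed in this thread can be sketched with the gradient math stubbed out as strings, so only the cache and gradient bookkeeping is visible; the layer count and cache contents are placeholders, not the graded solution:

```python
# Sketch of the reverse loop's indexing only.
L = 3                                        # assumed number of layers
caches = [f"cache_{l}" for l in range(L)]    # caches[0] .. caches[L-1]
grads = {}

# Output layer (sigmoid) uses the last cache, caches[L-1], and dAL:
grads[f"dA{L - 1}"] = "dA_prev from cache_2"
grads[f"dW{L}"] = "dW from cache_2"
grads[f"db{L}"] = "db from cache_2"

# Hidden layers (relu), l = L-2 down to 0:
for l in reversed(range(L - 1)):
    current_cache = caches[l]                # caches[l], not caches[l-1]
    upstream = grads[f"dA{l + 1}"]           # the layer above's dA, not dAL
    # dA_prev_temp, dW_temp, db_temp = linear_activation_backward(upstream, current_cache, "relu")
    grads[f"dA{l}"] = f"dA_prev from {current_cache}"  # store the whole array, not dA_prev_temp[l]
    grads[f"dW{l + 1}"] = f"dW from {current_cache}"
    grads[f"db{l + 1}"] = f"db from {current_cache}"
```

Running this shows each iteration reading caches[l] and grads["dA" + str(l + 1)], and writing grads["dA" + str(l)], which is exactly the indexing the failing tests check.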