# Exercise 9 - L_model_backward

Hello all,
I am getting this error in my code. I would appreciate it if anyone could help me with it.

``````
TypeError                                 Traceback (most recent call last)
<ipython-input-61-3ace16762626> in <module>
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

<ipython-input-60-5fb00bf09a9d> in L_model_backward(AL, Y, caches)
43     current_cache = caches
44
---> 45     dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, activation = "sigmoid")
46
47     grads["dA" + str(L-1)] = dA_prev_temp

<ipython-input-49-1771e554f5a6> in linear_activation_backward(dA, cache, activation)
36
37
---> 38         dZ = sigmoid_backward(dA, activation_cache)
39         dA_prev, dW, db = linear_backward(dZ,  linear_cache)
40

~/work/release/W4A1/dnn_utils.py in sigmoid_backward(dA, cache)
74     Z = cache
75
---> 76     s = 1/(1+np.exp(-Z))
77     dZ = dA * s * (1-s)
78

TypeError: bad operand type for unary -: 'tuple'
``````
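The message itself points at the cause: `sigmoid_backward` computes `np.exp(-Z)`, so its `cache` argument must be the NumPy array `Z`. If the whole `caches` list/tuple is passed instead, Python tries to negate a tuple and fails. A tiny reproduction (toy values and shapes assumed):

```python
import numpy as np

# sigmoid_backward expects the *activation cache* Z, a NumPy array.
# If the whole (linear_cache, activation_cache) tuple is passed instead,
# the unary minus in np.exp(-Z) fails: tuples do not support negation.
Z = np.array([[0.5, -1.2, 3.0]])
cache_tuple = ((np.ones((1, 3)), None, None), Z)  # (linear_cache, activation_cache)

s = 1 / (1 + np.exp(-Z))       # works: Z is an ndarray
print(s.shape)                 # (1, 3)

try:
    1 / (1 + np.exp(-cache_tuple))
except TypeError as err:
    print(err)                 # bad operand type for unary -: 'tuple'
```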

I found the solution.
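For anyone landing here later: in the traceback above, `current_cache = caches` passes the whole list of caches, so `sigmoid_backward` receives a tuple instead of the array `Z`. A minimal sketch of the fix, using toy data and a simplified copy of the helper from `dnn_utils.py` (the real notebook functions are assumed):

```python
import numpy as np

def sigmoid_backward(dA, cache):
    # Simplified copy of the helper in dnn_utils.py: cache holds the array Z.
    Z = cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

# Toy caches list: one (linear_cache, activation_cache) tuple per layer.
caches = [((None, None, None), np.zeros((3, 2))),   # layer 1 (relu)
          ((None, None, None), np.zeros((1, 2)))]   # layer 2 (sigmoid)

L = len(caches)
current_cache = caches[L - 1]          # index the LAST cache, not the whole list
linear_cache, activation_cache = current_cache

dAL = np.ones((1, 2))
dZ = sigmoid_backward(dAL, activation_cache)
print(dZ.shape)                        # (1, 2)
```

The key point is that `caches` is a list with one entry per layer, so the sigmoid step at the end must unpack `caches[L - 1]` before calling the backward helper.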

I am getting the same error (`bad operand type for unary -: 'tuple'`) but I don't know why.

I am using the following code before the for loop: `current_cache = caches[L - 1]`, and I am calling `linear_activation_backward` with the parameters `dAL`, `caches`, and `"sigmoid"` as the activation function.

I am also using the same statements within the for loop, with the same parameters as above but with `"relu"` as the activation function.

Not sure what I am missing.

Also, if I try to use `current_cache` instead of `caches`, I get the following error:

``````
IndexError                                Traceback (most recent call last)
in
      1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
      3
      4 print("dA0 = " + str(grads['dA0']))
      5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
     66 # YOUR CODE STARTS HERE
     67 current_cache = caches[l]
---> 68 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, activation= "relu")
     69 grads["dA" + str(l)] = dA_prev_temp
     70 grads["dW" + str(l+1)] = dW_temp

in linear_activation_backward(dA, cache, activation)
     23 # YOUR CODE STARTS HERE
     24
---> 25 dZ = relu_backward(dA, activation_cache)
     26 dA_prev, dW, db = linear_backward(dZ, linear_cache)
     27

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
     54
     55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
     57
     58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 1 but corresponding boolean dimension is 3
``````

Again, not sure what I am missing.
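For the second traceback: inside the loop, `dAL` is still being passed, but `dAL` has the final layer's shape while `caches[l]` holds a hidden layer's `Z`, so the boolean mask in `relu_backward` no longer matches the array being indexed. A minimal sketch that reproduces the shape mismatch, with toy shapes and a simplified copy of the helper assumed from `dnn_utils.py`:

```python
import numpy as np

def relu_backward(dA, cache):
    # Simplified copy of the helper in dnn_utils.py: cache holds the array Z.
    Z = cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0        # requires dA and Z to have the same shape
    return dZ

# Toy shapes (hypothetical): the output layer has 1 unit, a hidden layer has 3.
m = 4
dAL = np.ones((1, m))                                    # gradient w.r.t. final activations
Z_hidden = np.arange(-6, 6).reshape(3, m).astype(float)  # hidden layer's cached Z

# Passing dAL to a *hidden* layer's relu_backward reproduces the IndexError,
# because the mask Z <= 0 has shape (3, 4) while dZ has shape (1, 4):
try:
    relu_backward(dAL, Z_hidden)
except IndexError as err:
    print(err)            # boolean index did not match indexed array ...

# Inside the loop, the gradient flowing into layer l+1 should be used instead,
# i.e. grads["dA" + str(l + 1)], whose shape matches that layer's Z:
dA_hidden = np.ones((3, m))
dZ = relu_backward(dA_hidden, Z_hidden)
print(dZ.shape)           # (3, 4)
```

In other words, the fix is not only `current_cache = caches[l]` but also feeding the loop the previously stored gradient rather than `dAL` every iteration.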