W4_A1 Ex-9 L_model_backward: IndexError, boolean mismatch

Hello! Can you help me with Exercise 9, L_model_backward?

I reviewed my code and it seems correct, but I get the following result:

IndexError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
58 # YOUR CODE STARTS HERE
59 current_cache = caches[l]
---> 60 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, activation = "relu")
61 grads["dA" + str(l)] = dA_prev
62 grads["dW" + str(l + 1)] = dW_temp

in linear_activation_backward(dA, cache, activation)
22 # dA_prev, dW, db = ...
23 # YOUR CODE STARTS HERE
---> 24 dZ = relu_backward(dA, activation_cache)
25 dA_prev, dW, db = linear_backward(dZ, linear_cache)
26 # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 1 but corresponding boolean dimension is 3

Can you help me, please?

For the hidden layers, dAL is not the input of linear_activation_backward; it should be the dA of layer (l + 1), i.e. grads["dA" + str(l + 1)]. Passing dAL gives relu_backward a gradient whose shape does not match that layer's Z in the cache, which is exactly what the boolean IndexError is complaining about.
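Here is a minimal self-contained sketch of the whole backward pass, with hypothetical simplified helpers (not the course's exact dnn_utils code), just to show the shape of the fix: the sigmoid output layer is the only place dAL is used, and every hidden-layer iteration consumes grads["dA" + str(l + 1)] and stores dA_prev_temp (note: not dA_prev).

```python
import numpy as np

# Hypothetical minimal helpers, sketched for illustration only.
def relu_backward(dA, Z):
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0          # dA and Z must have identical shapes here
    return dZ

def linear_backward(dZ, cache):
    A_prev, W, b = cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    linear_cache, Z = cache
    if activation == "relu":
        dZ = relu_backward(dA, Z)
    else:                    # "sigmoid"
        s = 1 / (1 + np.exp(-Z))
        dZ = dA * s * (1 - s)
    return linear_backward(dZ, linear_cache)

def L_model_backward(AL, Y, caches):
    grads = {}
    L = len(caches)
    dAL = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

    # Output layer (sigmoid): dAL is the input here, and ONLY here.
    current_cache = caches[L - 1]
    dA_prev_temp, dW_temp, db_temp = linear_activation_backward(
        dAL, current_cache, activation="sigmoid")
    grads["dA" + str(L - 1)] = dA_prev_temp
    grads["dW" + str(L)] = dW_temp
    grads["db" + str(L)] = db_temp

    # Hidden layers (relu): feed grads["dA" + str(l + 1)], NOT dAL,
    # and store dA_prev_temp (not dA_prev).
    for l in reversed(range(L - 1)):
        current_cache = caches[l]
        dA_prev_temp, dW_temp, db_temp = linear_activation_backward(
            grads["dA" + str(l + 1)], current_cache, activation="relu")
        grads["dA" + str(l)] = dA_prev_temp
        grads["dW" + str(l + 1)] = dW_temp
        grads["db" + str(l + 1)] = db_temp
    return grads
```

With the correct dA flowing into each layer, every gradient comes out with the same shape as the parameter it updates.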

My mistakes were passing dAL on every iteration of the derivative loop and an incorrect variable name (dA_prev instead of dA_prev_temp). Thanks for your help!
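For anyone else who hits this IndexError: it means the dA you passed in does not match the Z stored in that layer's cache. A tiny NumPy demonstration (hypothetical shapes chosen to mirror the traceback above, where dimension 0 was 1 vs a boolean dimension of 3):

```python
import numpy as np

dA_wrong = np.random.randn(1, 4)   # gradient taken from the WRONG layer (e.g. dAL)
Z = np.random.randn(3, 4)          # this layer's pre-activation from the cache

dZ = np.array(dA_wrong, copy=True)
try:
    dZ[Z <= 0] = 0                 # boolean mask shape (3, 4) vs array shape (1, 4)
except IndexError as e:
    # NumPy reports a boolean-index / array shape mismatch along dimension 0
    print(e)
```

Boolean masking requires the mask and the indexed array to have the same shape, so feeding the output layer's dAL into a hidden layer's relu_backward fails immediately.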