Week 4 Exercise 9 L_Model_Backward Function

Hi, I am getting an error in the `L_model_backward` function, in the relu activation part. What index for `l` needs to be used?


IndexError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
59 current_cache=caches[1]
60
---> 61 dA_prev_temp,dW_temp,db_temp=linear_activation_backward(grads["dA"+str(l+1)],current_cache, activation = "relu")
62 grads["dA"+str(l)]=dA_prev_temp
63 grads["dW"+str(l+1)]=dW_temp

in linear_activation_backward(dA, cache, activation)
22 # dA_prev, dW, db = …
23 # YOUR CODE STARTS HERE
---> 24 dZ = relu_backward(dA, activation_cache)
25 dA_prev, dW, db = linear_backward(dZ, linear_cache)
26

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 3 but corresponding boolean dimension is 1

Take a look at the code in `relu_backward` to see what it is doing. When you do, you'll find that the error means the `dA` value you passed in does not match the cache value you passed (which is `Z`, the "activation cache"): they come from different layers, so their shapes disagree. So why do they not match? Hint: look at line 59 in your traceback, `current_cache=caches[1]`. That line runs inside a loop over `l`, so should the index be a constant?
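To make the indexing concrete, here is a minimal, self-contained sketch of the hidden-layer loop where the cache is indexed by the loop variable (`caches[l]`, not a hard-coded `caches[1]`). The helper functions, toy shapes, and the stand-in output-layer step below are assumptions for illustration, not the course's exact code:

```python
import numpy as np

def relu_backward(dA, Z):
    # dZ = dA where Z > 0, else 0 (mirrors what dnn_utils.relu_backward does)
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def linear_backward(dZ, linear_cache):
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

# Toy 2-layer forward pass so every cache has realistic shapes
np.random.seed(0)
m = 2
A0 = np.random.randn(4, m)
W1, b1 = np.random.randn(3, 4), np.zeros((3, 1))
Z1 = W1 @ A0 + b1
A1 = np.maximum(0, Z1)
W2, b2 = np.random.randn(1, 3), np.zeros((1, 1))
Z2 = W2 @ A1 + b2

# caches[l] holds the cache for layer l+1, exactly as in the assignment
caches = [((A0, W1, b1), Z1), ((A1, W2, b2), Z2)]
L = len(caches)

grads = {}
dA2 = np.random.randn(1, m)        # pretend gradient arriving at the output
dZ2 = dA2                          # stand-in for the output-layer activation backward
grads["dA1"], grads["dW2"], grads["db2"] = linear_backward(dZ2, caches[L - 1][0])

# Hidden-layer loop: l runs L-2 .. 0, and each iteration uses caches[l]
for l in reversed(range(L - 1)):
    linear_cache, Z = caches[l]    # caches[l], NOT caches[1]
    dZ = relu_backward(grads["dA" + str(l + 1)], Z)
    dA_prev, dW, db = linear_backward(dZ, linear_cache)
    grads["dA" + str(l)] = dA_prev
    grads["dW" + str(l + 1)] = dW
    grads["db" + str(l + 1)] = db

print(grads["dW1"].shape)  # (3, 4) -- matches W1, as a gradient must
```

With a hard-coded `caches[1]`, the loop would hand `relu_backward` a `dA` from one layer and a `Z` from another, and the boolean mask `dZ[Z <= 0] = 0` fails with exactly the dimension mismatch in your traceback.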