W4_A1_Ex-9: IndexError

I read up on this, and it seems the issue is around the cache index, but I am not sure where I am messing it up. Help would be appreciated.
This is what I am passing in the loop that runs from l = L-2 down to 0:
```python
current_cache = caches[l - 1]
dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(l + 1)], current_cache, "relu")
```

```
IndexError                                Traceback (most recent call last)
<ipython-input> in <module>
      1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
      3
      4 print("dA0 = " + str(grads['dA0']))
      5 print("dA1 = " + str(grads['dA1']))

<ipython-input> in L_model_backward(AL, Y, caches)
     57     # YOUR CODE STARTS HERE
     58     current_cache = caches[l - 1]
---> 59     dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(l + 1)], current_cache, "relu")
     60     grads["dA" + str(l)] = dA_prev_temp
     61     grads["dW" + str(l + 1)] = dW_temp

<ipython-input> in linear_activation_backward(dA, cache, activation)
     22     # dA_prev, dW, db = ...
     23     # YOUR CODE STARTS HERE
---> 24     dZ = relu_backward(dA, activation_cache)
     25     dA_prev, dW, db = linear_backward(dZ, linear_cache)
     26     # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
     54
     55     # When z <= 0, you should set dz to 0 as well.
---> 56     dZ[Z <= 0] = 0
     57
     58     assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 3 but corresponding boolean dimension is 1
```

That error message means that the shape of the dA value that you passed down to relu_backward does not match the shape of the Z value that was in the activation cache you passed. So how could that happen? I think you’re right that one place to look is to make sure you’re managing the l values correctly. Try putting some instrumentation in your linear_activation_backward code or in the loop in L_model_backward to print those shapes and see what happens.
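For example, here is a minimal instrumentation sketch, assuming each entry of `caches` has the `((A_prev, W, b), Z)` layout built in `L_model_forward`; the print line is temporary debugging code, not part of the graded solution:

```python
# Temporary instrumentation for the hidden-layer loop in L_model_backward.
# Assumes each cache entry is ((A_prev, W, b), Z).
for l in reversed(range(L - 1)):
    current_cache = caches[l - 1]          # the index under suspicion
    linear_cache, activation_cache = current_cache
    A_prev, W, b = linear_cache
    Z = activation_cache
    dA = grads["dA" + str(l + 1)]
    # relu_backward needs dA.shape == Z.shape; a mismatch here exposes the bad index
    print("l =", l, "| dA:", dA.shape, "| Z:", Z.shape, "| W:", W.shape)
```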

Also, the way to start debugging is to write out the "dimensional analysis" for the test case here: what are the shapes of W, A, and Z for every layer of the network? Then compare that to what you are seeing. Why does it come out wrong in your code? One quick way to tabulate the expected shapes is sketched below.
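As a sketch (again assuming the `((A_prev, W, b), Z)` cache layout), you can dump every shape straight from the caches returned by the test case and compare them with what your code actually passes around:

```python
# Print the shape of every array stored in the test caches, layer by layer.
t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
for i, (linear_cache, Z) in enumerate(t_caches, start=1):
    A_prev, W, b = linear_cache
    print(f"layer {i}: A_prev {A_prev.shape}, W {W.shape}, b {b.shape}, Z {Z.shape}")
```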

If your linear_activation_backward code passes the test cases, then that means the bug is in L_model_backward, right?

Thanks @paulinpaloalto, I was able to figure out the issue: I had the incorrect index being passed in the for loop.
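For future readers, here is a minimal sketch of the indexing that resolves this kind of error, assuming the hidden-layer loop runs l = L-2 down to 0 as in the notebook template (your notebook's variable names may differ). The key point is that `caches` is 0-indexed, so layer l+1's cache lives at `caches[l]`, not `caches[l - 1]`:

```python
for l in reversed(range(L - 1)):               # l = L-2, ..., 1, 0
    current_cache = caches[l]                  # layer l+1's cache sits at index l
    dA_prev_temp, dW_temp, db_temp = linear_activation_backward(
        grads["dA" + str(l + 1)], current_cache, "relu")
    grads["dA" + str(l)] = dA_prev_temp        # gradient flowing into layer l
    grads["dW" + str(l + 1)] = dW_temp
    grads["db" + str(l + 1)] = db_temp
```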

That’s great news that you found the solution just based on those hints. Nice work! Onward! :nerd_face: