W4_A1_Ex-9_IndexError: unmatched boolean index

Hello, I have this error and couldn’t fix it.

The error:

IndexError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
47 #(approx. 5 lines)
48 current_cache = caches[1]
---> 49 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(L-1)], current_cache, activation = "relu")
50 grads["dA" + str(l)] = dA_prev_temp
51 grads["dW" + str(l + 1)] = dW_temp

in linear_activation_backward(dA, cache, activation)
19 if activation == "relu":
20 #(≈ 2 lines of code)
---> 21 dZ = relu_backward(dA, activation_cache)
22 dA_prev, dW, db = linear_backward(dZ, linear_cache)
23 # YOUR CODE STARTS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 3 but corresponding boolean dimension is 1

Welcome to the community!

First of all, please remove your code from here; pasting solution code is not recommended.
Regarding your problem, it may be a matter of "font"…
Within the for loop, you used a fixed index to retrieve current_cache from caches. It needs to be the loop variable, not a literal number.
Please look carefully: the index is "l" (the lowercase letter L), not "1" (the number one).
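To see why a wrong cache index produces exactly this IndexError, here is a minimal sketch (with hypothetical layer sizes, not the assignment's actual data): the boolean mask `Z <= 0` must have the same shape as the array it indexes, so mixing a gradient from one layer with a Z cached from another layer fails immediately.

```python
import numpy as np

# Hypothetical shapes: the correct layer has 3 units, so Z is (3, m),
# but a hard-coded index fetched dA from a layer with 1 unit.
Z = np.random.randn(3, 4)       # Z cached by the correct layer
dA = np.random.randn(1, 4)      # gradient pulled from the wrong layer

dZ = np.array(dA, copy=True)
try:
    dZ[Z <= 0] = 0              # boolean mask is (3, 4), array is (1, 4)
except IndexError as e:
    print(e)                    # boolean index did not match indexed array...

# With shapes that match, the very same line works:
dA_ok = np.random.randn(3, 4)
dZ_ok = np.array(dA_ok, copy=True)
dZ_ok[Z <= 0] = 0
assert dZ_ok.shape == Z.shape
```

The "dimension is 3 but corresponding boolean dimension is 1" wording in the traceback tells you which of the two operands came from the wrong layer.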


One additional thing…
In the for loop over "l", you used a fixed variable to get dA from the dictionary. That only works once, and it happens to pass this local test because the network is small, but the grader may use more layers, which triggers the problem here. So please revisit the parameters you pass to linear_activation_backward() inside the for loop as well.
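The pitfall above is easy to see in a toy example (purely illustrative, not the assignment code): hard-coding an index inside a loop returns the same element every iteration, while the loop variable selects each layer's own entry.

```python
# Hypothetical per-layer caches for a 3-layer toy network.
caches = [("cache_of_layer", i + 1) for i in range(3)]

# Hard-coding index 1 returns layer 2's cache on every iteration:
picked_fixed = [caches[1] for l in range(3)]
assert picked_fixed == [("cache_of_layer", 2)] * 3

# Using the loop variable l picks the matching cache each time:
picked_loop = [caches[l] for l in range(3)]
assert picked_loop == caches
```

With only a couple of layers the shapes may coincidentally line up, which is why a hard-coded index can pass a small local test and still fail the grader.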

Hi, I’m getting a similar index issue, but different bounds. I’m definitely using the letters “l” and “L” for the caches and not the number 1. Any idea what’s going on? Thank you.

EDIT: I found out the issue and fixed it. It’s working now.


IndexError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
58 # YOUR CODE STARTS HERE
59 current_cache = caches[l]
---> 60 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, activation = "relu")
61 grads["dA" + str(l)] = dA_prev_temp
62 grads["dW" + str(l + 1)] = dW_temp

in linear_activation_backward(dA, cache, activation)
22 # dA_prev, dW, db = …
23 # YOUR CODE STARTS HERE
---> 24 dZ = relu_backward(dA, activation_cache)
25 dA_prev, dW, db = linear_backward(dZ, linear_cache)
26 # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 1 but corresponding boolean dimension is 3


Hi, I’m glad you solved it. Do you remember how you did it? I’m having the same problem at the same point and would appreciate some help.

Thank you very much.


Hello Jose,

Besides what has been proposed as a solution earlier, did you check the layer being passed for the "relu" activation?

To simplify it more, here’s the computation:

---> 60 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, activation = "relu")

Can you get it now? Thanks!

I’m trying to understand why the test checks dA0. Shouldn’t we stop at layer 1? And how can we calculate it, since the hidden-layer loop stops at l = 1?

for l in reversed(range(L-1)):

Thanks
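For what it’s worth, enumerating that loop for a small hypothetical L shows it actually reaches l = 0, which is the iteration that stores dA0:

```python
# Assume L = 3 layers (two relu layers plus the sigmoid output layer).
L = 3
indices = list(reversed(range(L - 1)))
print(indices)  # [1, 0]

# On the final iteration (l = 0) the loop stores grads["dA0"], the
# gradient with respect to the input activations, so the test can check it.
names = ["dA" + str(l) for l in indices]
assert names == ["dA1", "dA0"]
```

So the loop does not stop at l = 1: `range(L - 1)` starts at 0, and reversing it just changes the visiting order.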

Me too!

I’m afraid I don’t understand well here. Regarding the line for dAL, it should be 'sigmoid' instead of 'relu', right? That is the layer just before the output. Then, in the loop, it should be dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dA_prev_temp, current_cache, activation = "relu"), right?

I think I have a similar problem here. In the loop with 'relu', I refer to grads["dA" + str(l+1)] in the linear_activation_backward call. Is this OK?
These are my error messages:

IndexError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
59 # YOUR CODE STARTS HERE
60 current_cache = caches[1]
---> 61 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(l+1)], current_cache, activation = "relu")
62 grads["dA" + str(l)] = dA_prev_temp
63 grads["dW" + str(l+1)] = dW_temp

in linear_activation_backward(dA, cache, activation)
22 # dA_prev, dW, db = …
23 # YOUR CODE STARTS HERE
---> 24 dZ = relu_backward(dA, activation_cache)
25 dA_prev, dW, db = linear_backward(dZ, linear_cache)
26 # YOUR CODE ENDS HERE

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 3 but corresponding boolean dimension is 1

@MissTulip, since this thread has been cold for over a year, it’s probably better if you start a new thread for your question.

It is a bad idea to hard-code the index value used to select the caches entry there. Note that the font here may be part of the problem: please be careful that "1" (the number one) is not the same thing as "l" (the letter ell).
