Week 4 assignment 1 ex 9

IndexError Traceback (most recent call last)
in
1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
3
4 print("dA0 = " + str(grads['dA0']))
5 print("dA1 = " + str(grads['dA1']))

in L_model_backward(AL, Y, caches)
61 # YOUR CODE STARTS HERE
62 current_cache=caches[1]
---> 63 dA_prev_temp, dW_temp, db_temp = linear_activation_backward(grads["dA" + str(l + 1)], current_cache, activation = "relu")
64 grads["dA" + str(l)] = dA_prev_temp
65 grads["dW" + str(l + 1)] = dW_temp

in linear_activation_backward(dA, cache, activation)
22 # dA_prev, dW, db = …
23 # YOUR CODE STARTS HERE
---> 24 dZ = relu_backward(dA, activation_cache)
25 dA_prev, dW, db = linear_backward(dZ, linear_cache)
26

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
54
55 # When z <= 0, you should set dz to 0 as well.
---> 56 dZ[Z <= 0] = 0
57
58 assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 3 but corresponding boolean dimension is 1

Please help, I have been stuck on this for hours.

This means that the arguments you are passing down from L_model_backward to linear_activation_backward are not correct. They don’t match somehow. To debug this, you can start by looking at the source for relu_backward to see what that error really means. Click “File → Open” and then open the file dnn_utils.py. Then add a print statement in L_model_backward or in linear_activation_backward to show the shape of the dA value. Once you know that, compare that to the “dimensional analysis” for the test case that is being used there. What shape should the dA value be? Why is it coming out incorrectly?
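To see concretely what that error message means, here is a minimal standalone reproduction with made-up shapes: the mask Z <= 0 is built from one layer's cached Z while dA (and hence dZ) belongs to a different layer, so the boolean index has the wrong number of rows. The shapes here are invented for illustration, not taken from the test case.

```python
import numpy as np

dA = np.ones((3, 4))          # gradient for a 3-unit layer
Z = np.ones((1, 4))           # Z cached for a different, 1-unit layer
dZ = np.array(dA, copy=True)

msg = ""
try:
    dZ[Z <= 0] = 0            # the masking step inside relu_backward
except IndexError as err:
    msg = str(err)
print(msg)
```

Running this prints the same "boolean index did not match indexed array along dimension 0" message as in your traceback, which tells you the dA you passed in and the Z in the cache come from different layers.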


Actually it may be the cache value that is incorrect. You can see from the exception trace that in your code you have this line:

current_cache=caches[1]

That’s a bug, right? You are always referencing the second element of the caches array instead of the one that matches the current layer of the “for” loop. I think you meant l (lower case ell) there, not the digit 1. The two characters look very similar, so that bug can be hard to “see”.


Thank you so much! That did the trick.
