W4A1 | Exercise 9 | Unable to resolve a dimension mismatch

I am almost finished, but neither my intuition nor my debugging skills have helped me get past the blocker I've hit in this exercise.

IndexError                                Traceback (most recent call last)
<ipython-input-308-3ace16762626> in <module>
      1 t_AL, t_Y_assess, t_caches = L_model_backward_test_case()
----> 2 grads = L_model_backward(t_AL, t_Y_assess, t_caches)
      4 print("dA0 = " + str(grads['dA0']))
      5 print("dA1 = " + str(grads['dA1']))

<ipython-input-307-15524019e232> in L_model_backward(AL, Y, caches)
     59         # YOUR CODE STARTS HERE
     60         current_cache = caches[l]
---> 61         dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, "relu")
     62         grads["dA" + str(l-1)] = dA_prev_temp
     63         grads["dW" + str(l)] = dW_temp

<ipython-input-287-af391c57315b> in linear_activation_backward(dA, cache, activation)
     22         # dA_prev, dW, db =  ...
     23         # YOUR CODE STARTS HERE
---> 24         dZ = relu_backward(dA, activation_cache)
     25         dA_prev, dW, db = linear_backward(dZ, linear_cache)

~/work/release/W4A1/dnn_utils.py in relu_backward(dA, cache)
     55     # When z <= 0, you should set dz to 0 as well.
---> 56     dZ[Z <= 0] = 0
     58     assert (dZ.shape == Z.shape)

IndexError: boolean index did not match indexed array along dimension 0; dimension is 1 but corresponding boolean dimension is 3
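To show the mismatch concretely, here is a tiny reproduction with made-up shapes that mirror the "1 vs 3" in the message (they are illustrative, not the actual test-case shapes):

```python
import numpy as np

# dA has 1 row but the cached Z has 3 rows, so the boolean mask
# Z <= 0 (shape (3, 2)) cannot index dZ (shape (1, 2)).
Z = np.array([[1.0, -2.0], [3.0, -4.0], [-5.0, 6.0]])  # shape (3, 2)
dA = np.array([[0.5, 0.5]])                            # shape (1, 2)

dZ = np.array(dA, copy=True)
try:
    dZ[Z <= 0] = 0  # the same line as in dnn_utils.py's relu_backward
except IndexError as e:
    print(e)
```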

Something is squashing dZ's dimensions and I can't figure out what. I keep staring at it; every time a possibility comes to mind, I check whether it is actually the problem, rule it out, and end up back at square one.

What am I doing wrong? Should I go back and build a better intuition for this cache mechanism, studying until I understand the role of every function and dimension well enough to spot my mistake at a glance? I will try to do so; in the meantime, any help would be appreciated.

Did you look at the logic in relu_backward to see how it works? It is in the file dnn_utils.py, which you can access by clicking “File → Open” and then opening that file. Of course the general principle is that just because you didn’t write that code does not mean the problem is not your fault. A perfectly correct subroutine will throw errors if you pass it bad arguments.

So that error means that the dA value and the activation cache value that you are passing to relu_backward do not match. They should be the same shape. So what shapes are they? Which one is correct? Why is the other one not the same? This is how debugging works: one step at a time. You start from the point of the failure. What does the error mean? Then work your way backwards one step at a time until you find the actual source of the problem.
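One way to carry out that first step is to surface the shapes right where the failure happens. A tiny helper along these lines (my own sketch, not part of the assignment) could be dropped in just before the relu_backward call:

```python
import numpy as np

def check_shapes(dA, activation_cache, layer):
    """Print the two shapes that must match before relu_backward runs,
    and fail loudly (with layer context) when they do not."""
    Z = activation_cache
    print(f"layer {layer}: dA {dA.shape} vs cached Z {Z.shape}")
    if dA.shape != Z.shape:
        raise ValueError(f"layer {layer}: dA {dA.shape} != Z {Z.shape}")

# Matching shapes pass quietly; mismatched ones name the offending layer.
check_shapes(np.zeros((3, 4)), np.zeros((3, 4)), layer=2)
try:
    check_shapes(np.zeros((1, 4)), np.zeros((3, 4)), layer=1)
except ValueError as e:
    print("caught:", e)
```

Seeing which layer number prints just before the failure tells you where to start looking.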

Another good way to approach any kind of dimension mismatch is to work out the "dimensional analysis": given the sizes of all the input objects, what should the shapes of the A, W and b values be at each layer? Then add prints to show the layer numbers and the shapes of everything, and watch how it plays out. At some point your logic goes off the rails, and that tells you where to look. Here's a thread that shows how to do dimensional analysis for a different test case. You should apply those methods to the test case here and see what you get.
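As a sketch of what those prints might look like on a toy network (the layer sizes here are my own assumptions, not the test case's):

```python
import numpy as np

# Toy network for dimensional analysis: n_x = 4 inputs, one hidden
# layer of 3 units, 1 output unit, m = 5 examples (all assumed).
layer_dims = [4, 3, 1]
m = 5

rng = np.random.default_rng(0)
params = {}
for l in range(1, len(layer_dims)):
    params["W" + str(l)] = rng.standard_normal((layer_dims[l], layer_dims[l - 1]))
    params["b" + str(l)] = np.zeros((layer_dims[l], 1))

A = rng.standard_normal((layer_dims[0], m))  # A0 = X
for l in range(1, len(layer_dims)):
    W, b = params["W" + str(l)], params["b" + str(l)]
    Z = W @ A + b
    print(f"layer {l}: W {W.shape} . A_prev {A.shape} + b {b.shape} -> Z {Z.shape}")
    A = np.maximum(0, Z)  # ReLU forward, just to propagate the shapes
```

Compare the printed shapes with what you worked out on paper; the first layer where they disagree is where the bug lives.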


Thank you very much. I went through all of W4's notes and made a summary, then walked through every single function, including the ones in dnn_utils.py. The problem was not a catastrophic mistake at all: I had simply forgotten to offset the indices (plural of index, I hope?) in a block of code I had copied from the commented-out section a few lines above. Spending so much time on this one mistake taught me several valuable lessons; the most effective protocols for dealing with such bugs turned out to be methodical debugging, rereading all the materials, and simply double-checking the parameters being passed. The problem is resolved.
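For anyone who hits the same thing: the traceback above shows dAL being passed to relu_backward inside the loop, so every hidden layer received the output layer's gradient. Here is a schematic of what the corrected loop looks like, with simplified stand-ins for the helpers (not the graded code, and index conventions vary between versions of the assignment):

```python
import numpy as np

def relu_backward(dA, Z):
    # Zero the gradient wherever the forward ReLU was inactive.
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def linear_backward(dZ, linear_cache):
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = dZ @ A_prev.T / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def hidden_layer_loop(grads, caches, L):
    for l in reversed(range(L - 1)):
        linear_cache, Z = caches[l]
        # Buggy:     dZ = relu_backward(dAL, Z)        # same dA every layer!
        # Corrected: use the gradient stored for layer l + 1.
        dZ = relu_backward(grads["dA" + str(l + 1)], Z)
        dA_prev, dW, db = linear_backward(dZ, linear_cache)
        grads["dA" + str(l)] = dA_prev
        grads["dW" + str(l + 1)] = dW
        grads["db" + str(l + 1)] = db
    return grads

# Toy run: 3-layer net with dims [4, 3, 2, 1] and m = 5 examples (assumed).
rng = np.random.default_rng(0)
m, L, dims = 5, 3, [4, 3, 2, 1]
caches, A = [], rng.standard_normal((dims[0], m))
for l in range(1, L):  # build caches for the two ReLU layers
    W = rng.standard_normal((dims[l], dims[l - 1]))
    b = np.zeros((dims[l], 1))
    Z = W @ A + b
    caches.append(((A, W, b), Z))
    A = np.maximum(0, Z)

grads = {"dA2": rng.standard_normal((dims[2], m))}  # from the sigmoid layer
grads = hidden_layer_loop(grads, caches, L)
print({k: v.shape for k, v in sorted(grads.items())})
```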

It’s great news that you were able to solve the problem! It sounds like you also added some new skills to your debugging repertoire, so future problems will be easier to handle. Onward! :nerd_face:

And Twinty explained the question and the solutions with some great artistic flair. That was really spectacular to read :slight_smile: