Can someone please help me with this? I've gone through the forum and couldn't find a way to solve it. I keep getting the following error when I run the L_model_backward function.
I'm passing the right dA values: for the hidden layers I'm passing dA[l+1], and dAL for the output layer.
Are you sure that you didn’t modify the logic in linear_activation_backward that parses the current_cache into the linear and activation caches? That code was just given to you and doesn’t need to be modified.
If that's not the problem, then the next step is to print the shapes of dA and the Z value in the linear_activation_backward logic and see if that gives you any clue. It's messier to add debugging prints in relu_backward, because that function lives in a separate "utils" Python file.
Note that we can tell from where the exception is thrown that the error is happening in one of the hidden layers, not the output layer. Maybe also print the layer number in the loop to confirm that. For example:
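Here's a minimal sketch of that kind of shape-debugging, assuming the usual course layout where each entry of caches is a (linear_cache, activation_cache) pair with activation_cache being that layer's Z, and grads uses "dA1", "dA2", ... keys. The helper name and the fake data are made up purely for illustration; your notebook's actual structures may differ:

```python
import numpy as np

# Illustrative helper (not part of the assignment): print the shapes that
# feed each hidden layer's backward step, assuming caches[l] belongs to
# layer l + 1 and holds (linear_cache, activation_cache) with Z as the
# activation cache.
def debug_backward_shapes(grads, caches):
    L = len(caches)                   # total number of layers
    for l in reversed(range(L - 1)):  # hidden layers, highest first
        linear_cache, Z = caches[l]
        dA = grads["dA" + str(l + 1)]
        print(f"hidden layer {l + 1}: dA shape {dA.shape}, Z shape {Z.shape}")

# Toy 2-layer setup with made-up shapes, just to see what the prints look like:
fake_caches = [((None, None, None), np.zeros((4, 3))),  # hidden layer: 4 units, m = 3
               ((None, None, None), np.zeros((1, 3)))]  # output layer
fake_grads = {"dA1": np.zeros((4, 3))}
debug_backward_shapes(fake_grads, fake_caches)
# -> hidden layer 1: dA shape (4, 3), Z shape (4, 3)
```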
I have only added the calculation that was needed, and even the test block for that function runs without any issues. I'm a little bit confused; I will try to check the shapes of dA and Z.
That looks fine. Then the error is thrown in the for loop. Which iteration of the loop, and what are the shapes? I added print statements to my L_model_backward and here's what I see:
That's not how indexing works in Python, which I mentioned above. Watch this:
```python
for ii in range(4):
    print(f"ii = {ii}")
print(f"After loop ii = {ii}")
```

which prints:

```
ii = 0
ii = 1
ii = 2
ii = 3
After loop ii = 3
```
```python
for ii in reversed(range(5)):
    print(f"ii = {ii}")
print(f"After loop ii = {ii}")
```

which prints:

```
ii = 4
ii = 3
ii = 2
ii = 1
ii = 0
After loop ii = 0
```
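That same pattern is what drives the hidden-layer loop in L_model_backward. Here's an indexing-only sketch (no actual gradient math), assuming the usual convention that caches[l] belongs to layer l + 1 and that the dA entering hidden layer l + 1 is grads["dA" + str(l + 1)]; the value of L here is just an example:

```python
# Indexing-only sketch of an L_model_backward-style loop: shows which
# cache and grads entries each iteration would touch, nothing more.
L = 4  # e.g. a 4-layer network (assumption for illustration)
print(f"output layer {L}: handled separately, starting from dAL")
for l in reversed(range(L - 1)):  # l = 2, 1, 0
    print(f"hidden layer {l + 1}: uses caches[{l}] and grads['dA{l + 1}']")
```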
But actually, I would say that the code you showed looks correct to me, so this is a bit of a puzzle. Are you sure you really ran your new code? Note that if you just type new code and then run the test again without actually clicking "Shift-Enter" on the modified cell, it simply reruns the old code.