Course 1, Week 4: Building your Deep Neural Network Step by Step - ex. 9

Hi, I am stuck here, please help:

# current_cache = ...
# dA_prev_temp, dW_temp, db_temp = ...
# grads["dA" + str(L-1)] = ...
# grads["dW" + str(L)] = ...
# grads["db" + str(L)] = ...

Is it right to use the L_model_forward() function here? And if it is, why am I getting this shape error?

I have also tried linear_activation_backward(), but I get a weird error with that as well.

Your second approach, calling linear_activation_backward(), is the way to go, but you need to get the arguments right. I suggest you read the instructions again and pay particular attention to what they tell you in the comments. It is critical to keep track of which layer you are handling at a given point, so that you can get all the inputs correct.
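Here is a toy sketch of the pattern I mean (my own minimal stand-ins for the course functions; the real notebook functions differ in detail, so treat this only as an illustration of the indexing, not the official solution). I assume, as in the course, that linear_activation_backward(dA, cache, activation) returns (dA_prev, dW, db) and that each cache is ((A_prev, W, b), Z):

```python
import numpy as np

def linear_activation_backward(dA, cache, activation):
    # Toy stand-in for the course function (same interface assumed).
    (A_prev, W, b), Z = cache
    m = A_prev.shape[1]
    if activation == "sigmoid":
        s = 1 / (1 + np.exp(-Z))
        dZ = dA * s * (1 - s)
    else:  # "relu"
        dZ = dA * (Z > 0)
    dW = dZ @ A_prev.T / m
    db = dZ.sum(axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

# Toy 2-layer network: 3 -> 4 -> 1, with a batch of 5 examples.
rng = np.random.default_rng(0)
n_x, n_h, n_y, m = 3, 4, 1, 5
X = rng.standard_normal((n_x, m))
W1 = rng.standard_normal((n_h, n_x)); b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((n_y, n_h)); b2 = np.zeros((n_y, 1))
Z1 = W1 @ X + b1; A1 = np.maximum(0, Z1)          # relu hidden layer
Z2 = W2 @ A1 + b2; AL = 1 / (1 + np.exp(-Z2))     # sigmoid output layer
Y = (rng.random((n_y, m)) > 0.5).astype(float)
caches = [((X, W1, b1), Z1), ((A1, W2, b2), Z2)]

L = len(caches)
grads = {}
dAL = -(Y / AL - (1 - Y) / (1 - AL))

# Output layer (sigmoid) first. Note the cache index is L - 1 (0-based
# Python list), while the grads keys use the 1-based layer number L.
current_cache = caches[L - 1]
dA_prev_temp, dW_temp, db_temp = linear_activation_backward(dAL, current_cache, "sigmoid")
grads["dA" + str(L - 1)] = dA_prev_temp
grads["dW" + str(L)] = dW_temp
grads["db" + str(L)] = db_temp

# Then the hidden layers, from layer L-1 down to layer 1; loop index l is
# 0-based, so it refers to layer l + 1.
for l in reversed(range(L - 1)):
    current_cache = caches[l]
    dA_prev_temp, dW_temp, db_temp = linear_activation_backward(
        grads["dA" + str(l + 1)], current_cache, "relu")
    grads["dA" + str(l)] = dA_prev_temp
    grads["dW" + str(l + 1)] = dW_temp
    grads["db" + str(l + 1)] = db_temp
```

A quick sanity check on the shapes (each dW matches its W, each db its b) is a good way to convince yourself the layer bookkeeping is right.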

Thank you, I got it right in the end; there were some small mistakes!
However, after reading it again and again, and watching all the lectures, I am still very confused about the whole L_model_backward.
One source of confusion is how to keep track of which layer I'm handling.
Also, when do I use L, and when L-1? (I guess when I work with my own data, this will be a tricky part.)

I find that it helps to just write out the "dimensional analysis" for all the layers beforehand. Here's a thread which gives an example of what I mean by that. The other thing you have to keep track of is how indexing works in Python: it's "0-based". That applies both to indexing an array and to the indices on for loops. Run the following loops and watch what happens:

for ii in range(1,5):
    print(f"ii = {ii}")

for ii in reversed(range(4)):
    print(f"ii = {ii}")