Exercise 8 - linear_activation_backward

Thanks for helping.

When you post, it would be good to mention the week and the assignment.

In the call `t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation="relu")`, can you check that `t_linear_activation_cache` is a tuple with 2 elements? That's what the error points to.


Can you tell me how you reached the conclusion, from the screenshot, that it is a "tuple error" and that the tuple should have two elements?

Because I can see the previous functions, and from the message, this error comes up when a tuple is involved!

But the previous function worked well… it didn't report any shape errors.
What did I call in `linear_activation_backward` that could have changed the number of elements inside `linear_activation_cache`?
And if you look at the `linear_backward` function, don't you also see where this tuple dimension problem could stem from?
Moreover, if I had to debug this by myself, how am I supposed to find a path to the solution from:

    TypeError: cannot unpack non-iterable NoneType object

Anyway, I still don't understand what I can do to solve this problem.
A clear answer would be really appreciated.
Thanks for your efforts.

The first thing I would do is check the type of `t_linear_activation_cache`. If it comes out as a tuple, then that part is OK, and the next step is to check whether your function implementation is right, i.e. whether it handles tuples correctly. That's what I would do.
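To make that check concrete, here is a minimal sketch. The cache layout below (`(linear_cache, activation_cache)`, with placeholder arrays) is an assumption based on how the assignment describes the caches, not the grader's actual test data:

```python
import numpy as np

# Hypothetical cache built the way the assignment describes it:
# linear_cache holds (A_prev, W, b), activation_cache holds Z
linear_cache = (np.zeros((3, 2)), np.zeros((1, 3)), np.zeros((1, 1)))
activation_cache = np.zeros((1, 2))
t_linear_activation_cache = (linear_cache, activation_cache)

# The sanity checks suggested above:
print(type(t_linear_activation_cache))  # <class 'tuple'>
print(len(t_linear_activation_cache))   # 2
```

If `type(...)` prints something other than `tuple`, or `len(...)` is not 2, the cache was built or passed incorrectly before it ever reached `linear_activation_backward`.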

What is stored in the cache is a tuple…
And `t_linear_activation` is a tuple… even if I do not understand how it can give a sensible result when asked to print the type of a variable declared after the point where I am introducing my code.
Anyway, I'm still at the same point… I do not understand what your point is and how I can match it with the error message it gave me.
If I store values in tuples before the `linear_activation_backward` function, why should they no longer be accepted?
I renew my kind request for a clear answer.

Hello @Luca_De_Renzo!

Check that your output with sigmoid is the same as expected. However, there is something wrong in the relu case. The input arguments in both cases are the same, but we just need to call `relu_backward` in the case of relu, right? So how are you getting this error? Compare your implementation of the relu branch with the sigmoid branch and check how they differ.
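For reference, here is a minimal sketch of what the two backward helpers typically compute. This is my own illustration with hypothetical names (`relu_backward_sketch`, `sigmoid_backward_sketch`), not the course's actual `relu_backward`/`sigmoid_backward` code:

```python
import numpy as np

def relu_backward_sketch(dA, Z):
    # ReLU gradient: pass dA through where Z > 0, zero it elsewhere
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0
    return dZ

def sigmoid_backward_sketch(dA, Z):
    # sigmoid gradient: dZ = dA * s * (1 - s), with s = sigmoid(Z)
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

dA = np.array([[1.0, 2.0, -3.0]])
Z = np.array([[0.5, -1.0, 2.0]])
print(relu_backward_sketch(dA, Z))  # [[ 1.  0. -3.]]
```

Note that both take the same arguments; only the gradient formula differs, which is why the two branches of `linear_activation_backward` should look almost identical apart from which helper they call.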



My analysis of the original error is that it's not about the cache. It's that the return value of `linear_activation_backward` must be `None`. So you must have modified the template code they gave you in such a way that the `return` statement is no longer part of the function. Did you "outdent" that line? You can't just mess with the indentation in Python: it's a key part of the syntax.


    With sigmoid: A = [[0.96890023 0.11013289]]
    With ReLU: A = [[3.43896131 0. ]]
    All tests passed.

Expected output:

    With sigmoid: A = [[0.96890023 0.11013289]]
    With ReLU: A = [[3.43896131 0. ]]

I'm not sure I get the point. If you're talking about:

    A, activation_cache = relu(Z)
    A, activation_cache = sigmoid(Z)

those were given to us.

Yes, I did change it, but only because it was giving me an indentation error, as you said.

And now, even when I change the indentation back, it gives me:

    NameError: name 'linear_backward' is not defined

I am referring to this code, where the input arguments are the same in both cases but the functions called are different:

    if activation == "relu":
        #(≈ 2 lines of code)
        # dZ =  ...
        # dA_prev, dW, db =  ...
    elif activation == "sigmoid":
        #(≈ 2 lines of code)
        # dZ =  ...
        # dA_prev, dW, db =  ...

But as you said, you changed the indentation, so I guess that might be the cause of the error. Which line gives you an indentation error?

You have to run all the cells above. Every time you open the assignment, you have to re-run all the cells above before running the test cell.
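That is exactly what the `NameError` means: `linear_backward` only exists in the notebook session after the cell defining it has been executed. A tiny illustration (my own stand-in, not notebook code):

```python
# Simulates referencing a helper whose defining cell was never run:
try:
    linear_backward  # this name only exists after its cell is executed
except NameError as e:
    print(e)  # name 'linear_backward' is not defined
```

So a `NameError` on a function you can see in the notebook almost always means the kernel was restarted (or the notebook freshly opened) and the earlier cells were not re-run.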

I'm sorry to have caused so much effort and lost time for all of you, but I swear it was giving me an indentation error, and as I changed the indentation, the error disappeared…

I realized what happened, because I tried to do the same thing today.
Each time, to get to the right answer, I write one line of code at a time and run the cell, using the error message to see what modification I should make next.
So when more than one line of code is required to fully accomplish the task, as in this case, running only part of the code gives an indentation error, and without realizing it I thought my indentation was the mistake, so I outdented the line.
It won't happen again… I beg your pardon.

It’s good news that you now understand what was really happening: the lesson is learned and you won’t have this problem in the future. Thanks for explaining and letting us know.