Course 1 Week 4 Assignment 1 Exercise 8

There has been a previous post about this, but no one answered it; the OP just said she fixed it and that was it. Can someone help? I've spent too much time chasing bits.

Here is the issue: the cache that holds A_prev, W, b is a tuple, and it seems to vary in size (unless my eyes are just tired after 10+ hours of debugging this). My linear_backward(dZ, cache) passes all of its tests. I expected the cache to contain 3 items, but when I execute linear_activation_backward(dA, cache, activation) I get an error.

in
      1 t_dAL, t_linear_activation_cache = linear_activation_backward_test_case()
      2
----> 3 t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation = "sigmoid")
      4 print("With sigmoid: dA_prev = " + str(t_dA_prev))
      5 print("With sigmoid: dW = " + str(t_dW))

in linear_activation_backward(dA, cache, activation)
     33     # YOUR CODE STARTS HERE
     34     dZ = sigmoid_backward(dA, activation_cache)
---> 35     dA_prev, dW, db = linear_backward(dZ, cache)
     36
     37     # YOUR CODE ENDS HERE

in linear_backward(dZ, cache)
     17     #W = cache[1]
     18     #b = cache[2]
---> 19     A_prev, W, b = cache
     20     print("A_prev,W,b cache", cache)
     21

ValueError: not enough values to unpack (expected 3, got 2)

----- trying to unpack the tuple manually -----

     16     A_prev = cache[0]
     17     W = cache[1]
---> 18     b = cache[2]
     19     print("A_prev,W,b cache", cache)
     20

IndexError: tuple index out of range

I think my error is indexing the cache directly (cache[0], cache[1], ...) for the assignments instead of unpacking it inside linear_backward(dZ, cache), but that causes:
AssertionError: Not all tests were passed for linear_activation_backward
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
4 Tests passed
2 Tests failed
Any help would be appreciated.
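[Editor's note] Both tracebacks above are consistent with one underlying fact: the tuple being unpacked has only 2 elements, not 3. A minimal reproduction with placeholder values (not the course's actual cache contents):

```python
# A 2-element tuple, standing in for (linear_cache, activation_cache).
cache = ("linear_cache_placeholder", "activation_cache_placeholder")

# Unpacking 2 items into 3 names raises the first error seen above.
try:
    A_prev, W, b = cache
except ValueError as e:
    print(e)  # not enough values to unpack (expected 3, got 2)

# Indexing past the end of a 2-tuple raises the second error.
try:
    b = cache[2]
except IndexError as e:
    print(e)  # tuple index out of range
```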

Hi @Gian, I would check your implementation of the linear_activation_backward function. The error surfaces in linear_backward, but by that point the cache being passed in is already wrong:

There are two parts that could give you a hint of what’s wrong with your code:

From the function definition:

cache -- tuple of values (linear_cache, activation_cache) we store for computing backward propagation efficiently

And first line of the function:

linear_cache, activation_cache = cache
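[Editor's note] To make the hint concrete: the cache is nested, so only two items exist at the top level. A toy version (shapes are illustrative assumptions, not the test case's values):

```python
import numpy as np

# Build a nested cache mirroring the docstring's description.
A_prev = np.ones((3, 2))
W = np.ones((1, 3))
b = np.zeros((1, 1))
Z = W @ A_prev + b

linear_cache = (A_prev, W, b)             # 3 items, for linear_backward
activation_cache = Z                      # for sigmoid_backward / relu_backward
cache = (linear_cache, activation_cache)  # only 2 items at the top level

# Correct first step inside linear_activation_backward:
linear_cache, activation_cache = cache
print(len(cache), len(linear_cache))      # 2 3
```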

Thanks Alberto, my dyslexia was kicking in. I figured out the issue; I grok the math, not so much the implementations. Everything works once you know to unpack the cache first and pass linear_cache as the parameter in the linear_backward(dZ, linear_cache) function call. Cheers!
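[Editor's note] A sketch of the working structure, for anyone landing here later. The helper implementations below are assumed toy versions, not the graded course code; the point is where each part of the nested cache goes:

```python
import numpy as np

def sigmoid_backward(dA, activation_cache):
    # Toy stand-in: activation_cache holds Z.
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

def linear_backward(dZ, linear_cache):
    # linear_cache has exactly 3 items, so this unpack succeeds.
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]
    dW = (dZ @ A_prev.T) / m
    db = np.sum(dZ, axis=1, keepdims=True) / m
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def linear_activation_backward(dA, cache, activation):
    # Unpack the 2-element outer tuple first ...
    linear_cache, activation_cache = cache
    if activation == "sigmoid":
        dZ = sigmoid_backward(dA, activation_cache)
    elif activation == "relu":
        Z = activation_cache
        dZ = dA * (Z > 0)
    # ... then pass only the inner linear_cache onward.
    return linear_backward(dZ, linear_cache)
```

Passing the whole nested cache to linear_backward is what produced the original ValueError.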


I'm glad you sorted it out 🙂

Hello Gian,

I am facing the same issue.
If possible, could you please share some hints to help me resolve the errors?

Regards,
Abhay

@Abhay25, I recommend you start a new thread and include a description of the issue and a screen capture image that shows any error messages.

This thread is three years old, the people involved in the thread are no longer active, and the course has been updated several times since then.

A new thread seems like a good idea.