There was an earlier post about this, but no one answered it; the OP just said she had fixed it and left it at that. Can someone help? I've spent too much time chasing bits.
Here is the issue: the cache that holds A_prev, W, b is a tuple, and its size seems to vary, unless my eyes are just tired after 10+ hours of debugging this.
My linear_backward(dZ, cache) passes all the tests.
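For context, my linear_backward is essentially the standard version from the assignment; I'm reconstructing it from memory here rather than pasting, so treat the details as approximate:

```python
import numpy as np

def linear_backward(dZ, cache):
    # cache is the linear cache (A_prev, W, b) saved during forward prop
    A_prev, W, b = cache
    m = A_prev.shape[1]

    # Standard gradients for Z = W @ A_prev + b
    dW = (1 / m) * np.dot(dZ, A_prev.T)
    db = (1 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = np.dot(W.T, dZ)

    return dA_prev, dW, db
```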
I expected the cache to contain three items. However, when I execute linear_activation_backward(dA, cache, activation), I get an error:
```
in <module>
      1 t_dAL, t_linear_activation_cache = linear_activation_backward_test_case()
      2
----> 3 t_dA_prev, t_dW, t_db = linear_activation_backward(t_dAL, t_linear_activation_cache, activation = "sigmoid")
      4 print("With sigmoid: dA_prev = " + str(t_dA_prev))
      5 print("With sigmoid: dW = " + str(t_dW))

in linear_activation_backward(dA, cache, activation)
     33     # YOUR CODE STARTS HERE
     34     dZ = sigmoid_backward(dA, activation_cache)
---> 35     dA_prev, dW, db = linear_backward(dZ, cache)
     36
     37     # YOUR CODE ENDS HERE

in linear_backward(dZ, cache)
     17     #W = cache[1]
     18     #b = cache[2]
---> 19     A_prev, W, b = cache
     20     print("A_prev,W,b cache", cache)
     21

ValueError: not enough values to unpack (expected 3, got 2)
```
Trying to unpack the tuple element by element instead:
```
     16 A_prev = cache[0]
     17 W = cache[1]
---> 18 b = cache[2]
     19 print("A_prev,W,b cache", cache)
     20

IndexError: tuple index out of range
```
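To rule out stale notebook state, I wrote a standalone repro with placeholder values; the actual contents don't matter, only the tuple sizes. It reproduces both errors if the cache is really a pair like (linear_cache, activation_cache), which is my guess at what's arriving:

```python
# Placeholder values; only the tuple arity matters here.
linear_cache = ("A_prev", "W", "b")          # 3 items, what linear_backward unpacks
cache = (linear_cache, "activation_cache")   # 2 items, what I seem to be receiving

try:
    A_prev, W, b = cache                     # 3 names, 2 items
except ValueError as e:
    print("ValueError:", e)                  # not enough values to unpack (expected 3, got 2)

try:
    b = cache[2]                             # index 2 on a 2-tuple
except IndexError as e:
    print("IndexError:", e)                  # tuple index out of range
```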
I think my error is using the cache[0]-style assignments instead of linear_backward(dZ, cache), but that change causes:
```
AssertionError: Not all tests were passed for linear_activation_backward
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
 4 Tests passed
 2 Tests failed
```
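In case it helps anyone reproduce, this is the probe I've been using to inspect a cache's structure before unpacking it; it's plain Python plus an optional .shape lookup for numpy arrays, nothing course-specific:

```python
def describe(obj, depth=0):
    # Recursively print the length/shape structure of a (possibly nested) cache tuple.
    pad = "  " * depth
    if isinstance(obj, tuple):
        print(f"{pad}tuple with {len(obj)} items")
        for item in obj:
            describe(item, depth + 1)
    else:
        shape = getattr(obj, "shape", None)   # numpy arrays expose .shape
        suffix = f" shape={shape}" if shape is not None else ""
        print(f"{pad}{type(obj).__name__}{suffix}")
```

Calling describe(cache) right before the failing unpack should show whether it is a pair wrapping the (A_prev, W, b) triple, but I'd like a second pair of eyes on the underlying mistake.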
Any help would be appreciated.