Course 1 Week 4 Exercise 8 linear_activation_backward

I'm having the following issue: I'm getting the error below in linear_activation_backward, and I'm really not sure where it comes from. The error specifically relates to unpacking the cache variable:

ValueError: not enough values to unpack (expected 3, got 2)

Here is my code:

{moderator edit - solution code removed}

This is the error that appears.

Here is the output of the previous function (linear_backward), where all test cases passed.


You are passing the wrong cache value when you call linear_backward. It should be the linear cache, but you are passing the full cache entry. They already extracted the two separate cache entries for you in the template code, right?
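To make the cache structure concrete, here is a minimal sketch. The names (A_prev, W, b, Z, linear_cache, activation_cache) follow the assignment's conventions, but this is an illustrative reconstruction, not the graded code:

```python
import numpy as np

# The forward pass stores a 2-tuple: (linear_cache, activation_cache),
# where linear_cache is itself the 3-tuple (A_prev, W, b) and
# activation_cache is the pre-activation Z.
A_prev = np.ones((3, 2))
W = np.ones((1, 3))
b = np.zeros((1, 1))
Z = W @ A_prev + b

cache = ((A_prev, W, b), Z)

# The unpacking the template already does for you:
linear_cache, activation_cache = cache

# linear_backward must receive linear_cache (the 3-tuple), NOT the
# full 2-tuple cache; otherwise its own unpacking of (A_prev, W, b)
# fails with "expected 3, got 2".
```

Passing `cache` instead of `linear_cache` to linear_backward is exactly what produces the ValueError in the original post.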


Note that it is a general principle of debugging that the actual error may not be in the function that throws the exception. A perfectly correct subroutine can throw errors if you pass it incorrect arguments, which is what happened here. You have to track backwards from the point of the exception to figure out where the real problem is.
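As a self-contained illustration (the stub below is hypothetical, not the assignment's code), a correct function raises exactly this ValueError when handed a 2-tuple where it expects a 3-tuple:

```python
# A perfectly correct subroutine can still raise if given the wrong
# argument: here the full (linear_cache, activation_cache) pair is
# passed where only the 3-tuple linear_cache is expected.
cache = ((None, None, None), None)  # (linear_cache, activation_cache)

def linear_backward_stub(linear_cache):
    A_prev, W, b = linear_cache  # expects a 3-tuple
    return A_prev, W, b

try:
    linear_backward_stub(cache)  # bug is at the call site, not here
except ValueError as e:
    print(e)  # not enough values to unpack (expected 3, got 2)
```

The traceback points inside the stub, but the real mistake is in the caller, which is why you have to track backwards from the exception.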


Thank you for responding. I was being impatient and didn't realize that.

I had the same problem! Thanks for posting the question and saving me a lot of trouble…!

May I get some hints on what is wrong here? I suppose the data from the cache should be read backwards, but I don't know how to implement that.

With sigmoid: dA_prev = [[ 0.44090989  0.        ]
 [ 0.37883606  0.        ]
 [-0.2298228   0.        ]]
With sigmoid: dW = [[ 0.44513824  0.37371418 -0.10478989]]
With sigmoid: db = [[-0.20837892]]
With relu: dA_prev = [[ 0.11017994  0.01105339]
 [ 0.09466817  0.00949723]
 [-0.05743092 -0.00576154]]
With relu: dW = [[ 0.10266786  0.09778551 -0.01968084]]
With relu: db = [[-0.05729622]]
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
 4  Tests passed
 2  Tests failed
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-92-1dd7958789b5> in <module>
     11 print("With relu: db = " + str(t_db))
     12 
---> 13 linear_activation_backward_test(linear_activation_backward)

~/work/release/W4A1/public_tests.py in linear_activation_backward_test(target)
    378     ]
    379 
--> 380     multiple_test(test_cases, target)
    381 
    382 def L_model_backward_test(target):

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140         print('\033[92m', success," Tests passed")
    141         print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142         raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for linear_activation_backward. Check your equations and avoid using global variables inside the function.

Expected output:

With sigmoid: dA_prev = [[ 0.11017994  0.01105339]
 [ 0.09466817  0.00949723]
 [-0.05743092 -0.00576154]]
With sigmoid: dW = [[ 0.10266786  0.09778551 -0.01968084]]
With sigmoid: db = [[-0.05729622]]
With relu: dA_prev = [[ 0.44090989  0.        ]
 [ 0.37883606  0.        ]
 [-0.2298228   0.        ]]
With relu: dW = [[ 0.44513824  0.37371418 -0.10478989]]
With relu: db = [[-0.20837892]]

Regards

The cache is passed in as an argument and they even give you the logic in the template code to extract the two components of the cache: the linear cache and the activation cache. Then you just have to pass the appropriate one to linear_backward and to relu_backward and sigmoid_backward.

Actually here’s a clue: compare your results to the expected values. You’ll notice that your values for the “with sigmoid” case agree with the expected values for the relu case. And vice versa. So you must have somehow got the logic backward in the two “if” clauses: you must have called relu_backward in the “sigmoid” case.
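The branch structure in question can be sketched as follows. relu_backward and sigmoid_backward are provided by the assignment; the versions here are illustrative stubs using the standard gradient formulas, not the course's code:

```python
import numpy as np

def relu_backward(dA, Z):
    # ReLU gradient: dA passes through where Z > 0, else 0 (illustrative stub)
    return dA * (Z > 0)

def sigmoid_backward(dA, Z):
    # sigmoid gradient: dA * s * (1 - s), with s = sigmoid(Z) (illustrative stub)
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

def activation_grad(dA, Z, activation):
    # Each branch must call the backward function that MATCHES the
    # activation string; swapping them gives exactly the symptom above,
    # where the "sigmoid" results match the expected relu results.
    if activation == "relu":
        return relu_backward(dA, Z)
    elif activation == "sigmoid":
        return sigmoid_backward(dA, Z)
```

Comparing your outputs against the expected values for the *other* activation is a quick way to spot this kind of swapped branch.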

Ah, of course, I called sigmoid in the relu case and relu in the sigmoid case…
Without your clue I was just too blind to see it; thank you for the response.


Hahaha, the same thing happened to me! I was starting to go crazy :joy:
Thanks :slight_smile: