Course 1, Week 4, Exercise 8

Hi,

I’ve spent a considerable amount of time on exercise 8, but it seems I’m not making any progress:

With sigmoid: dA_prev = [[ 0.44090989  0.        ]
 [ 0.37883606  0.        ]
 [-0.2298228   0.        ]]
With sigmoid: dW = [[ 0.44513824  0.37371418 -0.10478989]]
With sigmoid: db = [[-0.20837892]]
With relu: dA_prev = [[ 0.11017994  0.01105339]
 [ 0.09466817  0.00949723]
 [-0.05743092 -0.00576154]]
With relu: dW = [[ 0.10266786  0.09778551 -0.01968084]]
With relu: db = [[-0.05729622]]
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
 4  Tests passed
 2  Tests failed
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-77-1dd7958789b5> in <module>
     11 print("With relu: db = " + str(t_db))
     12 
---> 13 linear_activation_backward_test(linear_activation_backward)

~/work/release/W4A1/public_tests.py in linear_activation_backward_test(target)
    378     ]
    379 
--> 380     multiple_test(test_cases, target)
    381 
    382 def L_model_backward_test(target):

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140         print('\033[92m', success," Tests passed")
    141         print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142         raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for linear_activation_backward. Check your equations and avoid using global variables inside the function.

As far as I know, I feed the activation cache to the sigmoid and relu backward functions, take the resulting dZ, and pass it to linear backward along with the linear cache containing A_prev, W, and b (sketched below).
I’m still getting the error, even though my linear backward from exercise 7 passes all of its tests.
Is there something I’m missing?
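
Here is what I believe my code is doing, sketched with the notebook’s helper names (a sketch of my intent, not a paste of my actual code):

def linear_activation_backward(dA, cache, activation):
    # cache holds both pieces saved during the forward pass
    linear_cache, activation_cache = cache

    if activation == "relu":
        dZ = relu_backward(dA, activation_cache)     # dZ = dA * g'(Z) for relu
    elif activation == "sigmoid":
        dZ = sigmoid_backward(dA, activation_cache)  # dZ = dA * s * (1 - s)

    # linear_backward turns dZ into the gradients using (A_prev, W, b)
    dA_prev, dW, db = linear_backward(dZ, linear_cache)

    return dA_prev, dW, db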

Interesting. There really aren’t many moving parts here, and your description of how you wrote the code sounds correct to me. Here are the results I get in that test block:

With sigmoid: dA_prev = [[ 0.11017994  0.01105339]
 [ 0.09466817  0.00949723]
 [-0.05743092 -0.00576154]]
With sigmoid: dW = [[ 0.10266786  0.09778551 -0.01968084]]
With sigmoid: db = [[-0.05729622]]
With relu: dA_prev = [[ 0.44090989  0.        ]
 [ 0.37883606  0.        ]
 [-0.2298228   0.        ]]
With relu: dW = [[ 0.44513824  0.37371418 -0.10478989]]
With relu: db = [[-0.20837892]]
 All tests passed.

Oh, actually there’s a clue there: notice that your sigmoid results match my relu results. And vice versa. Hmmmmm. :nerd_face: :laughing:
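
My guess, consistent with the swapped outputs: the two branches are calling each other’s backward function, something like this:

if activation == "relu":
    dZ = sigmoid_backward(dA, activation_cache)  # oops: should be relu_backward
elif activation == "sigmoid":
    dZ = relu_backward(dA, activation_cache)     # oops: should be sigmoid_backward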

Wow. Nice catch. I cannot believe I wasted 4 hours on this one :grinning:
Thanks again :pray: