Week 4 - Building your deep neural network: step by step assignment - Error at Exercise 8

Hello. I am having trouble passing the test for the linear_activation_backward function. From what I have seen in the forums, people tend to mismatch the sigmoid and relu function locations. However, I am sure that is not the case for me, and yet I still get an output of 4 correct and 2 wrong. Can you give me a tip on why this might be happening?

Thank you!

With sigmoid: dA_prev = [[-0.04591835 -0.00062194]
 [-0.0394537  -0.00053438]
 [ 0.02393479  0.00032418]]
With sigmoid: dW = [[-0.04587658 -0.03916771  0.01054726]]
With sigmoid: db = [[0.02199546]]
With relu: dA_prev = [[-0.18375266  0.        ]
 [-0.1578829   0.        ]
 [ 0.09578046  0.        ]]
With relu: dW = [[-0.18551486 -0.15574832  0.04367201]]
With relu: db = [[0.08684355]]
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
 4  Tests passed
 2  Tests failed
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
<ipython-input-42-1dd7958789b5> in <module>
     11 print("With relu: db = " + str(t_db))
     12 
---> 13 linear_activation_backward_test(linear_activation_backward)

~/work/release/W4A1/public_tests.py in linear_activation_backward_test(target)
    378     ]
    379 
--> 380     multiple_test(test_cases, target)
    381 
    382 def L_model_backward_test(target):

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140         print('\033[92m', success," Tests passed")
    141         print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142         raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for linear_activation_backward. Check your equations and avoid using global variables inside the function.

Notice that all of your values, with both sigmoid and relu, are wrong. Since you say you don't have the sigmoid/relu mismatch, I will take your word for it. OK. Tell me: have you passed all the previous exercise tests?

Remember that you have to pass all the previous tests before proceeding to the next one.

Hello. I do realise that all the values are wrong; however, it tells me that I have passed 4 tests, which does not make sense to me.

I have restarted the whole notebook from scratch and passed all the tests. My understanding is that for Exercise 8, in the if/else branches for sigmoid and relu, I have to call the sigmoid and relu backward functions respectively, which I have done. To those relu_backward and sigmoid_backward functions I pass the activation cache as the second parameter, and to the linear_backward function I pass the linear cache. I have also checked the previous exercises for any mismatches, even though they all gave me the all-OK output.
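Here is a sketch of the structure I am describing (the helper names are the ones from the notebook; the exact signatures are my assumption):

```python
from dnn_utils import sigmoid_backward, relu_backward  # helpers provided by the notebook

def linear_activation_backward(dA, cache, activation):
    """Backward pass for one LINEAR->ACTIVATION layer."""
    linear_cache, activation_cache = cache

    if activation == "relu":
        # the helper computes dZ = dA * g'(Z) for relu internally
        dZ = relu_backward(dA, activation_cache)
    elif activation == "sigmoid":
        # the helper computes dZ = dA * g'(Z) for sigmoid internally
        dZ = sigmoid_backward(dA, activation_cache)

    # linear_backward is my Exercise 7 function; it takes dZ and the linear cache
    dA_prev, dW, db = linear_backward(dZ, linear_cache)
    return dA_prev, dW, db
```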

The test cases don't only check values; they also check shape, type, etc., and those are the checks you passed.

Please send me this code in a direct message. Click my name and send a message.

Just to update others: @Ali_A_Arisoy was multiplying dZ by dA, which led to the wrong values.

He misunderstood the formula below:

dZ^{[l]} = dA^{[l]} * g'(Z^{[l]})

Note that all of this computation is done by relu_backward and sigmoid_backward; you just need to pass the correct arguments.
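To make that concrete, here is a small numeric illustration. The sigmoid_backward body below mirrors what the notebook's dnn_utils helper does; the input values are made up:

```python
import numpy as np

def sigmoid_backward(dA, activation_cache):
    # Mirrors the notebook helper: it computes dZ = dA * g'(Z) itself.
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)

dA = np.array([[-0.5, 0.2]])
Z = np.array([[0.1, -1.2]])

dZ_correct = sigmoid_backward(dA, Z)      # pass dA straight through to the helper
dZ_buggy = dA * sigmoid_backward(dA, Z)   # the reported bug: dA applied twice
print("correct:", dZ_correct)
print("buggy:  ", dZ_buggy)
```

Because the helper already multiplies by dA, multiplying its result by dA again scales every entry by an extra factor of dA. That is exactly why all six values came out wrong while the shape and type checks still passed.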