Assignment : E8

What is sigmoid_backward and relu_backward ? I may not be using them correctly.

I am getting the error below:

```
With sigmoid: dA_prev = [[-0.04591835 -0.00062194]
 [-0.0394537  -0.00053438]
 [ 0.02393479  0.00032418]]
With sigmoid: dW = [[-0.04587658 -0.03916771  0.01054726]]
With sigmoid: db = [[0.02199546]]
With relu: dA_prev = [[-0.18375266  0.        ]
 [-0.1578829   0.        ]
 [ 0.09578046  0.        ]]
With relu: dW = [[-0.18551486 -0.15574832  0.04367201]]
With relu: db = [[0.08684355]]
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
 4 Tests passed
 2 Tests failed
```

```
AssertionError                            Traceback (most recent call last)
in
     11 print("With relu: db = " + str(t_db))
     12 
---> 13 linear_activation_backward_test(linear_activation_backward)

~/work/release/W4A1/public_tests.py in linear_activation_backward_test(target)
    467     ]
    468 
--> 469     multiple_test(test_cases, target)
    470 
    471 

~/work/release/W4A1/test_utils.py in multiple_test(test_cases, target)
    140     print('\033[92m', success, " Tests passed")
    141     print('\033[91m', len(test_cases) - success, " Tests failed")
--> 142     raise AssertionError("Not all tests were passed for {}. Check your equations and avoid using global variables inside the function.".format(target.__name__))
    143 

AssertionError: Not all tests were passed for linear_activation_backward. Check your equations and avoid using global variables inside the function.
```

If these don't help, click my name and message me your notebook as an attachment.

`sigmoid_backward` and `relu_backward` calculate the derivative of their respective activation functions. You can check their code in the `dnn_utils` file.
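As a rough sketch of what these helpers typically look like (assumed implementations for illustration; the authoritative versions are in your notebook's `dnn_utils.py`):

```python
import numpy as np

def sigmoid_backward(dA, cache):
    """Backward pass for sigmoid: dZ = dA * s * (1 - s), where s = sigmoid(Z)."""
    Z = cache
    s = 1.0 / (1.0 + np.exp(-Z))
    return dA * s * (1.0 - s)

def relu_backward(dA, cache):
    """Backward pass for ReLU: pass dA through where Z > 0, zero it elsewhere."""
    Z = cache
    dZ = np.array(dA, copy=True)  # copy so dA is not modified in place
    dZ[Z <= 0] = 0
    return dZ
```

Note that both take `dA` as an argument and return the full `dZ`, with the chain rule already applied.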

Regarding your error: double-check your code, since this exercise is just calling functions you have already written with the correct arguments. Also, have you passed `linear_backward`?


Now I got it right! `sigmoid_backward` and `relu_backward` compute the complete derivative `dZ`. The function names confused me; I thought they returned just `g'(Z[l])`.
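A quick numeric check of that point, using sigmoid with `Z = 0` (where `g'(0) = 0.25`); the values here are made up for illustration:

```python
import numpy as np

Z = np.array([[0.0]])
dA = np.array([[2.0]])

s = 1.0 / (1.0 + np.exp(-Z))
g_prime = s * (1.0 - s)  # just the derivative g'(Z): 0.25
dZ = dA * g_prime        # what sigmoid_backward returns: 0.5

print(g_prime, dZ)
```

So the helper returns `dA * g'(Z)` (here `0.5`), not `g'(Z)` alone (`0.25`), which is why it should be passed straight into `linear_backward` without multiplying by `dA` again.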