In the exercise "8 - linear_activation_backward" I believe my implementation is correct. I watched the videos over and over and read several related topics, and the output is still wrong. I genuinely suspect there may be a bug in the test. All previous exercises passed successfully.
My solution:
{moderator edit - solution code removed}
yields the following (wrong) outputs:
With sigmoid: dA_prev = [[-0.04591835 -0.00062194]
[-0.0394537 -0.00053438]
[ 0.02393479 0.00032418]]
With sigmoid: dW = [[-0.04587658 -0.03916771 0.01054726]]
With sigmoid: db = [[0.02199546]]
With relu: dA_prev = [[-0.18375266 0. ]
[-0.1578829 0. ]
[ 0.09578046 0. ]]
With relu: dW = [[-0.18551486 -0.15574832 0.04367201]]
With relu: db = [[0.08684355]]
Error: Wrong output with sigmoid activation for variable 0.
Error: Wrong output with sigmoid activation for variable 1.
Error: Wrong output with sigmoid activation for variable 2.
Error: Wrong output with relu activation for variable 0.
Error: Wrong output with relu activation for variable 1.
Error: Wrong output with relu activation for variable 2.
4 Tests passed
2 Tests failed
Please help!
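For reference, here is a minimal self-contained sketch of the formulas I am trying to implement, written from the lecture equations with toy helper signatures of my own (the graded notebook's functions take cache tuples instead, so this is not the course code, just the math):

```python
import numpy as np

# Standard one-layer backward pass from the lectures:
#   dZ      = dA * g'(Z)
#   dW      = (1/m) * dZ @ A_prev.T
#   db      = (1/m) * sum(dZ, axis=1, keepdims=True)
#   dA_prev = W.T @ dZ

def sigmoid_backward(dA, Z):
    # Derivative of sigmoid: s * (1 - s)
    s = 1.0 / (1.0 + np.exp(-Z))
    return dA * s * (1.0 - s)

def relu_backward(dA, Z):
    # Gradient passes through only where Z > 0
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0.0
    return dZ

def linear_backward(dZ, A_prev, W):
    m = A_prev.shape[1]
    dW = (1.0 / m) * dZ @ A_prev.T
    db = (1.0 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = W.T @ dZ
    return dA_prev, dW, db

def linear_activation_backward(dA, A_prev, W, Z, activation):
    if activation == "sigmoid":
        dZ = sigmoid_backward(dA, Z)
    elif activation == "relu":
        dZ = relu_backward(dA, Z)
    return linear_backward(dZ, A_prev, W)

# Shape check on toy data: 3 input units, 1 output unit, 2 examples
rng = np.random.default_rng(0)
A_prev = rng.standard_normal((3, 2))
W = rng.standard_normal((1, 3))
Z = W @ A_prev
dA = rng.standard_normal((1, 2))

dA_prev, dW, db = linear_activation_backward(dA, A_prev, W, Z, "sigmoid")
print(dA_prev.shape, dW.shape, db.shape)  # (3, 2) (1, 3) (1, 1)
```

The shapes above match the shapes of my outputs, so I believe the wiring is right; only the numbers differ from the expected values.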