DLS C1_W4_Assignment Ex. 8 - linear_activation_backward

Hello!

I’m stuck on Exercise 8 - linear_activation_backward (DLS, Course 1, Week 4 Assignment ‘Building_your_Deep_Neural_Network_Step_by_Step’)

The values returned by the test look like the expected output. However, there are still some errors.

What could have led to these errors?


Here are my correct answers:

```
With sigmoid: dA_prev = [[ 0.11017994  0.01105339]
 [ 0.09466817  0.00949723]
 [-0.05743092 -0.00576154]]
With sigmoid: dW = [[ 0.10266786  0.09778551 -0.01968084]]
With sigmoid: db = [[-0.05729622]]
With relu: dA_prev = [[ 0.44090989  0.        ]
 [ 0.37883606  0.        ]
 [-0.2298228   0.        ]]
With relu: dW = [[ 0.44513824  0.37371418 -0.10478989]]
With relu: db = [[-0.20837892]]
 All tests passed.
```

Your answers for the sigmoid case look correct. The relu values are all right, but the shapes are wrong. The db shape is 1 x 1 x 2 instead of 1 x 1. Also notice that you’ve got 3 sets of brackets on the relu dA_prev and dW values, so those are wrong too, but in a more subtle way: the dA_prev shape is probably 1 x 3 x 2 instead of 3 x 2. So how did that happen?
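
To make the bracket hint concrete, here is a minimal numpy sketch (not the assignment code, and the numbers are made up) showing how one extra set of printed brackets corresponds to one extra leading dimension in the array shape:

```python
import numpy as np

# A (3, 2) array prints with two opening brackets, like the correct dA_prev above.
dA_prev = np.array([[0.44, 0.0],
                    [0.38, 0.0],
                    [-0.23, 0.0]])
print(dA_prev.shape)   # (3, 2)
print(dA_prev)         # starts with "[[" and ends with "]]"

# Wrapping it in one more pair of brackets adds a leading dimension of size 1,
# which is the kind of mistake that turns a 3 x 2 result into 1 x 3 x 2.
wrapped = np.array([dA_prev])
print(wrapped.shape)   # (1, 3, 2)
print(wrapped)         # starts with "[[[" and ends with "]]]"
```

So if your dA_prev, dW, or db prints with one more set of brackets than the correct output, look for a place in your code where the result gets wrapped in an extra list or an extra pair of brackets before it is returned.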

Note that there is a different test that you can’t directly see in the notebook, but you can examine it by opening the file public_tests.py. For that test, you fail the value tests; the 4 tests that you do pass are probably just the type and shape of the results, while the values are wrong. In other words, those 4 passed tests are a pretty low bar.
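
If you want to look at it yourself, one simple way (assuming public_tests.py sits in the same directory as the notebook, which the suggestion to open it implies) is to print it from a code cell; you can also open it from the Jupyter file browser:

```python
# Print the contents of the public test file from a notebook cell.
with open("public_tests.py") as f:
    print(f.read())
```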

Start with the ReLU shape issue, and also look for some kind of “hard-coding” or global-variable issue if you mostly pass one test but then totally fail the other.
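
As a purely illustrative sketch of the “global variable” failure mode (this is not the assignment code; the function and variable names are invented for the example), the bug usually looks like a function that quietly uses a notebook-level variable instead of its own argument, so it happens to produce the right numbers for one test case and the wrong shape or values for every other call:

```python
import numpy as np

t_dA = np.random.randn(1, 2)   # a notebook-level (global) test variable

def scale_gradient(dA, factor):
    # BUG: refers to the global t_dA instead of the argument dA, so the
    # result only looks right when the caller happens to pass t_dA itself.
    return factor * t_dA

def scale_gradient_fixed(dA, factor):
    # Correct: uses only the function's own arguments.
    return factor * dA

print(scale_gradient(t_dA, 2.0).shape)                         # (1, 2): looks fine
print(scale_gradient(np.random.randn(3, 2), 2.0).shape)        # (1, 2): wrong shape
print(scale_gradient_fixed(np.random.randn(3, 2), 2.0).shape)  # (3, 2): correct
```

The cure is the same in either case: inside the function, use only the parameters that were passed in and values computed from them, never variables defined in earlier notebook cells.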


Thank you, @paulinpaloalto, for the detailed inspection of my issue. It was very helpful.