Week 3, Exercise 8

I’ve got every test above correct, but the cell tells me that the values of W1 are wrong, and W1 is unchanged after gradient descent.
I cannot find where the problem is. Please help!

The output of the cell is shown below.

AssertionError                            Traceback (most recent call last)
in
      7 print("b2 = " + str(parameters["b2"]))
      8
----> 9 nn_model_test(nn_model)

~/work/release/W3A1/public_tests.py in nn_model_test(target)
    273     assert output["b2"].shape == expected_output["b2"].shape, f"Wrong shape for b2."
    274
--> 275     assert np.allclose(output["W1"], expected_output["W1"]), "Wrong values for W1"
    276     assert np.allclose(output["b1"], expected_output["b1"]), "Wrong values for b1"
    277     assert np.allclose(output["W2"], expected_output["W2"]), "Wrong values for W2"

AssertionError: Wrong values for W1
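(For anyone hitting the same symptom: if W1 does not change between iterations, the parameter-update step is the usual suspect. Below is a minimal sketch of what the gradient-descent update in nn_model typically looks like; the dictionary keys and the learning rate are assumptions, not the assignment's exact code. Each parameter must be reassigned, otherwise its value never changes.)

```python
import numpy as np

def update_parameters(parameters, grads, learning_rate=1.2):
    # Sketch of one gradient-descent step: each parameter is
    # reassigned from its gradient. Forgetting the reassignment
    # (or the minus sign) leaves W1 unchanged across iterations.
    W1 = parameters["W1"] - learning_rate * grads["dW1"]
    b1 = parameters["b1"] - learning_rate * grads["db1"]
    W2 = parameters["W2"] - learning_rate * grads["dW2"]
    b2 = parameters["b2"] - learning_rate * grads["db2"]
    return {"W1": W1, "b1": b1, "W2": W2, "b2": b2}

# Tiny self-check: with nonzero gradients, W1 must move.
rng = np.random.default_rng(0)
params = {k: rng.standard_normal((2, 2)) for k in ("W1", "b1", "W2", "b2")}
grads = {"d" + k: np.ones((2, 2)) for k in ("W1", "b1", "W2", "b2")}
updated = update_parameters(params, grads)
print(np.allclose(updated["W1"], params["W1"]))  # False: W1 changed
```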

Hi @janliu99,

Can you share your lab ID with me? In the assignment, when you click the "Help" button at the top right, a panel will open and your lab ID will be shown at the bottom.

I shall take a look.

Thanks,
Mubsi

Hi! I’ve found the problem by reviewing my code! I accidentally had a typo in it, which caused the assertion error!

It’s great that you were able to find the solution under your own power! Thank you for confirming. Onward! :nerd_face:

1 Like

Hi, mentor! I want to report a mistake in the instructions of Week 3, Exercise 9.
The first derivative of AL (dAL) should be `dAL = - (np.divide(Y, AL) + np.divide(1 - Y, 1 - AL))`, but the instructions say `dAL = - (np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))`.

The instructions are correct. You are forgetting the Chain Rule. It is all covered on this thread.

1 Like

In the thread you linked, the formula should be the one in the picture I captured above, which is the formula I gave in my previous reply.
But the instructions use the "-" sign, which I think is a mistake.

No, it is correct. Check the parentheses carefully. The leading minus sign multiplies the entire expression. If you expand it out without the parens, you’ll see that you get the same result as shown in the image from the thread.
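To make the algebra concrete, here is a quick numerical check that the instructions' form (leading minus distributed over the parenthesized difference) agrees with the expanded form from the image. The random Y and AL values are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
AL = rng.uniform(0.05, 0.95, size=(1, 5))          # activations in (0, 1)
Y = rng.integers(0, 2, size=(1, 5)).astype(float)  # binary labels

# Form used in the instructions: minus sign multiplies the whole expression.
dAL_instructions = -(np.divide(Y, AL) - np.divide(1 - Y, 1 - AL))

# Expanded form, as shown in the image: -Y/AL + (1-Y)/(1-AL).
dAL_expanded = -np.divide(Y, AL) + np.divide(1 - Y, 1 - AL)

print(np.allclose(dAL_instructions, dAL_expanded))  # True: same values
```

Since `-(a - b) = -a + b`, the two expressions are identical term by term.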