W3_A1_Ex-6_AssertionError_Getting wrong values

Exercise 6. Backward Propagation
The following error is thrown even though my output values match the expected values.

dW1 = [[ 0.00301023 -0.00747267]
[ 0.00257968 -0.00641288]
[-0.00156892 0.003893 ]
[-0.00652037 0.01618243]]
db1 = [[ 0.00176201]
[ 0.00150995]
[-0.00091736]
[-0.00381422]]
dW2 = [[ 0.00078841 0.01765429 -0.00084166 -0.01022527]]
db2 = [[-0.16655712]]

AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>
      7 print("db2 = " + str(grads["db2"]))
      8
----> 9 backward_propagation_test(backward_propagation)

~/work/release/W3A1/public_tests.py in backward_propagation_test(target)
    187     assert output["db2"].shape == expected_output["db2"].shape, f"Wrong shape for db2."
    188
--> 189     assert np.allclose(output["dW1"], expected_output["dW1"]), "Wrong values for dW1"
    190     assert np.allclose(output["db1"], expected_output["db1"]), "Wrong values for db1"
    191     assert np.allclose(output["dW2"], expected_output["dW2"]), "Wrong values for dW2"

AssertionError: Wrong values for dW1

Expected output

dW1 = [[ 0.00301023 -0.00747267]
[ 0.00257968 -0.00641288]
[-0.00156892 0.003893 ]
[-0.00652037 0.01618243]]
db1 = [[ 0.00176201]
[ 0.00150995]
[-0.00091736]
[-0.00381422]]
dW2 = [[ 0.00078841 0.01765429 -0.00084166 -0.01022527]]
db2 = [[-0.16655712]]
All tests passed!

It's strange. Try Kernel → Restart and Clear Output, then Run All cells again. Let me know if that doesn't help.

I tried that once; it didn't work.

I even created a new notebook via File → Open and entered all the required user code. That didn't work either.

Please send me your code in a private message. Click my name and message.

PS: I will be back in 2 hrs.

To update others: while there are multiple mathematically correct ways to compute dZ1, it is recommended to follow the instruction given in the notebook:
"So you can compute g^{[1]'}(Z^{[1]}) using (1 - np.power(A1, 2))."
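To make that concrete, here is a minimal sketch of the backward pass the notebook is describing, assuming a tanh hidden layer, a sigmoid output layer, and the notebook's usual parameter/cache naming (W2, A1, A2); the key line is the dZ1 computation that reuses A1 instead of recomputing tanh:

```python
import numpy as np

def backward_propagation(parameters, cache, X, Y):
    """Sketch of the gradients for a 1-hidden-layer network
    (tanh hidden activation, sigmoid output, cross-entropy loss)."""
    m = X.shape[1]
    W2 = parameters["W2"]
    A1, A2 = cache["A1"], cache["A2"]

    dZ2 = A2 - Y                                        # sigmoid + cross-entropy
    dW2 = (1 / m) * np.dot(dZ2, A1.T)
    db2 = (1 / m) * np.sum(dZ2, axis=1, keepdims=True)
    # Recommended form: g'(Z1) = 1 - tanh(Z1)^2 = 1 - A1^2, reusing the cached A1
    dZ1 = np.dot(W2.T, dZ2) * (1 - np.power(A1, 2))
    dW1 = (1 / m) * np.dot(dZ1, X.T)
    db1 = (1 / m) * np.sum(dZ1, axis=1, keepdims=True)
    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}
```

This is only an illustration of the recommended dZ1 form, not the assignment's exact solution code.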

We are already working on why other mathematically correct ways give errors.

The same problem persists.

Please use A1 instead of np.tanh(cache["Z1"]).

It worked with A1, using (1 - np.power(A1, 2)) when calculating dZ1.

Thank you. Exercise 6 is now passed.

Yes, there is a bug already filed about this, but I don’t think there’s much hope that it will be fixed anytime soon. It would involve changing the grader as well and apparently that is non-trivial to do.

Here’s another recent thread about this. And here’s another one.

But also note that they explicitly tell you to use the A1 method in the instructions and they even go so far as to actually write out the code for you.
