Need help: why am I receiving an output error for A2 (it does not match the expected output) when executing the function forward_propagation()?
- Use the function sigmoid(). It's built into (imported) this notebook.
- Use the function np.tanh(). It's part of the numpy library.
- Implement using these steps:
  - Retrieve each parameter from the dictionary "parameters" (which is the output of initialize_parameters()) by using parameters[".."].
  - Implement Forward Propagation. Compute Z^{[1]}, A^{[1]}, Z^{[2]} and A^{[2]} (the vector of all your predictions on all the examples in the training set).
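For reference, here is a minimal sketch of the structure those instructions describe. It is not the graded solution: the sketch defines its own sigmoid only so it runs stand-alone (in the notebook you must use the imported sigmoid()), and the graded function is named forward_propagation, not forward_propagation_sketch.

import numpy as np

def sigmoid(z):
    # Stand-in for the notebook's imported sigmoid(); standard logistic function.
    return 1 / (1 + np.exp(-z))

def forward_propagation_sketch(X, parameters):
    # Retrieve each parameter from the dictionary produced by initialize_parameters()
    W1, b1 = parameters["W1"], parameters["b1"]
    W2, b2 = parameters["W2"], parameters["b2"]

    # Layer 1: linear step with np.dot, then tanh activation
    Z1 = np.dot(W1, X) + b1
    A1 = np.tanh(Z1)

    # Layer 2 (output): linear step, then sigmoid activation
    Z2 = np.dot(W2, A1) + b2
    A2 = sigmoid(Z2)

    cache = {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2}
    return A2, cache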
The above instructions are followed in the function:
def forward_propagation(X, parameters)
…
return A2, cache
However, the output A2 is not equal to the expected output.
Expected output
A2 = [[0.21292656 0.21274673 0.21295976]]
All tests passed!
All the steps run successfully except the assert. Please assist.
Are you using np.dot for Z^{[1]} and Z^{[2]}? Please share your full error.
The other thing to check is to make sure you used the activation functions correctly: the tanh is only at layer 1, right? Then we use sigmoid for the output layer. As Saif says, it would help to see the full error output you get.
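(For reference, the notebook's imported sigmoid() is conventionally just the logistic function; the exact notebook code may differ, but the behaviour is the standard one:)

import numpy as np

def sigmoid(z):
    # Logistic function: squashes any real value into (0, 1),
    # which is why it belongs on the binary-classification output layer.
    return 1 / (1 + np.exp(-z))

# np.tanh(z) instead squashes values into (-1, 1) and is used only on the hidden layer here.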
Yes, I am using np.dot
{moderator edit - solution code removed}
Complete output:
A2 = [[0.49990298 0.50005261 0.49993933]]
AssertionError                            Traceback (most recent call last)
<ipython-input-...> in <module>
      3 print("A2 = " + str(A2))
      4
----> 5 forward_propagation_test(forward_propagation)

~/work/release/W3A1/public_tests.py in forward_propagation_test(target)
    108     assert output[1]["Z2"].shape == expected_Z2.shape, f"Wrong shape for cache['Z2']."
    109
--> 110     assert np.allclose(output[0], expected_A2), "Wrong values for A2"
    111     assert np.allclose(output[1]["Z1"], expected_Z1), "Wrong values for cache['Z1']"
    112     assert np.allclose(output[1]["A1"], expected_A1), "Wrong values for cache['A1']"

AssertionError: Wrong values for A2
Expected output
A2 = [[0.21292656 0.21274673 0.21295976]]
All tests passed!
Yes, I am using the correct activation function, tanh, for layer 1.
I have provided the full error output in this post.
Interesting. The code that you show all looks correct and there really aren't very many moving parts there. But your answers for A2 are way off. I added logic to print all of Z1, A1, Z2 and A2 in my code and here's what I see when I run that test cell for forward_propagation:
Z1 = [[ 1.7386459 1.74687437 1.74830797]
[-0.81350569 -0.73394355 -0.78767559]
[ 0.29893918 0.32272601 0.34788465]
[-0.2278403 -0.2632236 -0.22336567]]
A1 = [[ 0.9400694 0.94101876 0.94118266]
[-0.67151964 -0.62547205 -0.65709025]
[ 0.29034152 0.31196971 0.33449821]
[-0.22397799 -0.25730819 -0.2197236 ]]
Z2 = [[-1.30737426 -1.30844761 -1.30717618]]
A2 = [[0.21292656 0.21274673 0.21295976]]
A2 = [[0.21292656 0.21274673 0.21295976]]
Z1 = [[ 1.7386459 1.74687437 1.74830797]
[-0.81350569 -0.73394355 -0.78767559]
[ 0.29893918 0.32272601 0.34788465]
[-0.2278403 -0.2632236 -0.22336567]]
A1 = [[ 0.9400694 0.94101876 0.94118266]
[-0.67151964 -0.62547205 -0.65709025]
[ 0.29034152 0.31196971 0.33449821]
[-0.22397799 -0.25730819 -0.2197236 ]]
Z2 = [[-1.30737426 -1.30844761 -1.30717618]]
A2 = [[0.21292656 0.21274673 0.21295976]]
All tests passed!
Do your Z1 and A1 values match? The only thing I can think of is that maybe you are not using the correct input values for the W and b matrices.
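For example, something along these lines inside your forward_propagation, just before the return, would show where the numbers start to diverge (temporary debug code; remove it before submitting):

# Temporary debugging prints inside forward_propagation, just before "return A2, cache"
print("W1 =", parameters["W1"])
print("b1 =", parameters["b1"])
print("W2 =", parameters["W2"])
print("b2 =", parameters["b2"])
print("Z1 =", Z1)
print("A1 =", A1)
print("Z2 =", Z2)
print("A2 =", A2)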
Note that the same test case gets repeated by the test function that is being called there, so you see everything twice. Why it makes sense to have two tests be the same is a different question.
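One more tip: np.allclose is the same check the test uses, so you can run it yourself in a scratch cell instead of comparing the printed arrays by eye. For example, with the two A2 arrays already quoted in this thread (the same idea works for Z1, A1 and Z2):

import numpy as np

expected_A2 = np.array([[0.21292656, 0.21274673, 0.21295976]])  # expected output from the notebook
your_A2     = np.array([[0.49990298, 0.50005261, 0.49993933]])  # values from the output you posted

print(np.allclose(your_A2, expected_A2))  # False here; the failing assert needs this to be True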