# W3_A1_Wrong value output for variable A2

@Mubsi
Greetings,

I am having an issue with the A2 calculation part of the assignment.

{moderator edit: code removed}
The above is my code.

While running the test cell for A2, I am getting a
"Wrong values for A2" error.

My Lab ID is fkjiiwdecmhl.

Regards,
Sudharshan.G

That code looks OK to me, except that posting your code isn’t allowed on the forum.

Perhaps the issue is in the sigmoid function (please do not post it here). Or maybe you modified something else that you were not supposed to.

When you have an error or an assert, please post a screen capture that shows the issue.

Yes, please show us the output when you run the test cell for the forward_propagation function. Here’s what I see:

```
A2 = [[0.21292656 0.21274673 0.21295976]]
All tests passed!
```


```python
print(parameters)
print(f"t_X = {t_X}")
Z1 = cache["Z1"]
A1 = cache["A1"]
Z2 = cache["Z2"]
A2 = cache["A2"]
print(f"Z1 = {Z1}")
print(f"A1 = {A1}")
print(f"Z2 = {Z2}")
print(f"A2 = {A2}")
```


Here’s what I get from running that:

```
{'W1': array([[-0.00416758, -0.00056267],
[-0.02136196,  0.01640271],
[-0.01793436, -0.00841747],
[ 0.00502881, -0.01245288]]), 'W2': array([[-0.01057952, -0.00909008,  0.00551454,  0.02292208]]), 'b1': array([[ 1.74481176],
[-0.7612069 ],
[ 0.3190391 ],
[-0.24937038]]), 'b2': array([[-1.3]])}
t_X = [[ 1.62434536 -0.61175641 -0.52817175]
[-1.07296862  0.86540763 -2.3015387 ]]
Z1 = [[ 1.7386459   1.74687437  1.74830797]
[-0.81350569 -0.73394355 -0.78767559]
[ 0.29893918  0.32272601  0.34788465]
[-0.2278403  -0.2632236  -0.22336567]]
A1 = [[ 0.9400694   0.94101876  0.94118266]
[-0.67151964 -0.62547205 -0.65709025]
[ 0.29034152  0.31196971  0.33449821]
[-0.22397799 -0.25730819 -0.2197236 ]]
Z2 = [[-1.30737426 -1.30844761 -1.30717618]]
A2 = [[0.21292656 0.21274673 0.21295976]]
```


Please try that; maybe it will give you a clue about where your Z1, A1, Z2, and A2 values go off the rails.

The sigmoid function is imported externally.
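For reference, this is just the textbook definition of the logistic sigmoid (not the assignment's hidden implementation), which the imported function should match:

```python
import numpy as np

def sigmoid(z):
    # Textbook logistic sigmoid: maps any real input into (0, 1).
    # The assignment imports its own version; this is only a reference
    # for sanity-checking outputs.
    return 1 / (1 + np.exp(-z))

print(sigmoid(0))  # 0.5
```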

This is the result:

Ok, so did you compare your outputs to what I showed? What clues can you find there? All the input values are the same, of course, but you would expect that. All of your output values, however, are different starting from Z1. But Z1 is just:

$Z1 = W1 \cdot X + b1$

and that’s what your code shows. So that’s not where the problem is. Maybe the earlier code that extracts W1, b1, and the other values is incorrect, and the values printed by the extra debug cell are not actually the ones you are using.
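As a sanity check, plugging the printed W1, b1, and t_X values (copied below, as printed) into exactly that formula reproduces the printed Z1, so the formula itself is not in question:

```python
import numpy as np

# Values copied from the debug printout above (rounded as printed).
W1 = np.array([[-0.00416758, -0.00056267],
               [-0.02136196,  0.01640271],
               [-0.01793436, -0.00841747],
               [ 0.00502881, -0.01245288]])
b1 = np.array([[ 1.74481176],
               [-0.7612069 ],
               [ 0.3190391 ],
               [-0.24937038]])
t_X = np.array([[ 1.62434536, -0.61175641, -0.52817175],
                [-1.07296862,  0.86540763, -2.3015387 ]])

Z1 = np.dot(W1, t_X) + b1  # b1 broadcasts across the 3 columns
print(Z1[0])  # matches the first row of the Z1 printed above
```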

Oh, ok, I’ve got a theory: I’ll bet you call initialize_parameters within the forward_propagation function. That is a mistake. They give you the parameter values as an input to the function and you don’t need to generate them. You will see later when you write nn_model that all these functions are independent and called from nn_model. They did not tell you to do that in the instructions, right?

Note that the mentors cannot directly see your notebook, but I was able to get the exact same results you show by making the mistake I described above.
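To see why that mistake produces wrong values, here is a minimal sketch (the function names and initializer body are simplified stand-ins, not the assignment's code) showing that re-initializing inside the forward pass makes the output independent of the parameters you were given:

```python
import numpy as np

np.random.seed(0)  # for reproducibility of this sketch

def initialize_parameters(n_x, n_h):
    # Stand-in for the assignment's initializer: returns FRESH random
    # weights every time it is called.
    return {"W1": np.random.randn(n_h, n_x) * 0.01,
            "b1": np.zeros((n_h, 1))}

def hidden_layer(X, parameters):
    # Correct: use the parameters exactly as passed in.
    return np.dot(parameters["W1"], X) + parameters["b1"]

def hidden_layer_buggy(X, parameters):
    # BUG: re-initializing here throws away the parameters that were
    # passed in, so the result no longer depends on them.
    parameters = initialize_parameters(X.shape[0], 4)
    return np.dot(parameters["W1"], X) + parameters["b1"]

X = np.random.randn(2, 3)
params = {"W1": np.ones((4, 2)), "b1": np.zeros((4, 1))}
expected = np.dot(params["W1"], X) + params["b1"]

print(np.allclose(hidden_layer(X, params), expected))        # True
print(np.allclose(hidden_layer_buggy(X, params), expected))  # False
```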

Then that is not where the problem lies.

and this test passes for me.

And do I need to call initialize_parameters in the code block?

I am getting different values for Z1 when computing `np.dot(W1, X) + b1`.

I think not. The “parameters” are passed to the forward_propagation() function already.

If you call initialize_parameters() inside the forward_propagation() function, you are wiping out the learned weight values and setting them back to random initial conditions.

At some point you do need to call initialize_parameters(), but that’s in the nn_model() function.
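To make the structure concrete, here is a rough sketch of how nn_model ties everything together. The helper bodies are simplified stand-ins (not the assignment's actual code); the point is only where initialize_parameters gets called:

```python
import numpy as np

def initialize_parameters(n_x, n_h, n_y):
    # Simplified stand-in: small random weights, zero biases.
    return {"W1": np.random.randn(n_h, n_x) * 0.01, "b1": np.zeros((n_h, 1)),
            "W2": np.random.randn(n_y, n_h) * 0.01, "b2": np.zeros((n_y, 1))}

def forward_propagation(X, parameters):
    # Uses the parameters exactly as passed in -- no re-initialization.
    Z1 = np.dot(parameters["W1"], X) + parameters["b1"]
    A1 = np.tanh(Z1)
    Z2 = np.dot(parameters["W2"], A1) + parameters["b2"]
    A2 = 1 / (1 + np.exp(-Z2))
    return A2, {"Z1": Z1, "A1": A1, "Z2": Z2, "A2": A2}

def nn_model(X, Y, n_h, num_iterations):
    # initialize_parameters is called ONCE, here.
    parameters = initialize_parameters(X.shape[0], n_h, Y.shape[0])
    for i in range(num_iterations):
        A2, cache = forward_propagation(X, parameters)
        # ... compute_cost, backward_propagation, update_parameters ...
    return parameters

params = nn_model(np.random.randn(2, 5), np.zeros((1, 5)), 4, 10)
print(sorted(params))  # ['W1', 'W2', 'b1', 'b2']
```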

Oh, ok, I get it now. I tried to run the initialize_parameters function again; I didn't see that it was already being called for me.

Now I am getting the results.

Thank you very much for your help.