C2_W2_SoftMax Why my_softmax output is different from tf.nn.softmax

Hello,
I tried to apply the my_softmax function in the Output Handling section instead of the function used in the code (tf.nn.softmax(p_preferred).numpy()).
I was supposing it would return the same result, but it doesn't. Why?

Output when running tf.nn.softmax(p_preferred).numpy():

two example output vectors:
 [[4.89e-04 2.26e-03 9.84e-01 1.32e-02]
 [9.94e-01 6.07e-03 7.60e-05 2.23e-06]]
largest value 0.9999987 smallest value 3.6734e-11

Output when running my_softmax(p_preferred):

two example output vectors:
 [[1.69e-08 7.83e-08 3.41e-05 4.55e-07]
 [2.16e-04 1.32e-06 1.65e-08 4.84e-10]]
largest value 0.099078394 smallest value 3.2865662e-13

Welcome to the community.

It looks like the output from your my_softmax is not normalized, i.e., the elements do not sum to 1. If you normalize it, the values appear to be exactly the same as tf.nn.softmax.

Please double-check.
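
In case it helps, here is a minimal sketch of a row-wise softmax in NumPy (softmax_rows is just an illustrative name, not the assignment's function). One plausible cause of the numbers you posted is summing np.exp(z) over the entire 2-D array instead of over each row, so no individual row sums to 1:

import numpy as np

def softmax_rows(z):
    # Subtract the row max before exponentiating; softmax is
    # shift-invariant, and this keeps np.exp from overflowing.
    z = np.asarray(z, dtype=np.float64)
    z_shifted = z - np.max(z, axis=-1, keepdims=True)
    ez = np.exp(z_shifted)
    # Divide each row by its own sum so every row sums to 1.
    return ez / np.sum(ez, axis=-1, keepdims=True)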

Thank you for the welcome message and the reply 🙂

I tested the following code to see the outputs and the sums:
my_softmax:

import numpy as np

test_z = np.array([10.0, 2.0, 3.0, 4.0])
return_softmax = my_softmax(test_z)  # my_softmax as defined in the assignment
print("return: ", return_softmax)
print("sum of return ", np.sum(return_softmax))

That’s the result:

return:  [9.96e-01 3.34e-04 9.08e-04 2.47e-03]
sum of return  1.0000000000000002

tf.nn.softmax:

import tensorflow as tf

return_tf_softmax = tf.nn.softmax(test_z).numpy()
print("return: ", return_tf_softmax)
print("sum of return ", np.sum(return_tf_softmax))

Result:

return:  [9.96e-01 3.34e-04 9.08e-04 2.47e-03]
sum of return  1.0

So the summation is very, very close to 1, but not exactly 1. I don't understand why that happens, but your answer makes sense.


That's essentially 1.0, with a little numerical variance due to the computational characteristics of the two implementations.
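
To make that concrete, here is a small float64 sketch (the exact last digits may vary by platform). Each division in softmax is rounded to the nearest representable float, and the summation rounds again, so the printed sum can miss 1.0 by a unit in the last place; an implementation that exponentiates and accumulates in a different order, as tf.nn.softmax may do internally, can land exactly on 1.0:

import math
import numpy as np

test_z = np.array([10.0, 2.0, 3.0, 4.0])
ez = np.exp(test_z)
probs = ez / np.sum(ez)  # plain float64 softmax

print(np.sum(probs))     # pairwise accumulation, e.g. 1.0000000000000002
print(math.fsum(probs))  # exact sum of the same rounded quotients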