Week 3 compute_cross_entropy_cost

I know that we should use `from_logits=True` to convert the logits into probabilities.
But can I also use `tf.keras.activations.softmax` to convert the logits explicitly?
With the former, the computed cost is 0.810287.
With the latter, it is 0.81028694.
So are they the same thing? (I actually tried the latter approach first, but the result didn't pass the check, and then I found `from_logits` in this forum.)

In theory, both approaches you've suggested are valid.
In practice, the `from_logits=True` way of doing things is encouraged for numerical-stability and performance reasons: the loss can apply a fused log-softmax internally instead of forming the probabilities and then taking their log.
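To see why the fused route matters, here is a minimal NumPy sketch of the two routes (not TensorFlow's actual implementation, just the same math): an "explicit softmax then log" cost versus a fused log-softmax built from the log-sum-exp trick. For moderate logits they agree to float precision, which matches the tiny difference you observed; for extreme logits the explicit route breaks.

```python
import numpy as np

def softmax(z):
    """Numerically shifted softmax (subtract the row max before exp)."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def ce_from_probs(z, y):
    """'Explicit softmax' route: form probabilities first, then take the log."""
    with np.errstate(divide="ignore"):
        return float(-np.sum(y * np.log(softmax(z)), axis=-1).mean())

def ce_from_logits(z, y):
    """'from_logits' route: fused log-softmax via log-sum-exp,
    never materializing the probabilities."""
    m = z.max(axis=-1, keepdims=True)
    log_p = z - m - np.log(np.exp(z - m).sum(axis=-1, keepdims=True))
    return float(-np.sum(y * log_p, axis=-1).mean())

# Moderate logits: both routes agree to float precision.
z = np.array([[2.0, 1.0, 0.1]])
y = np.array([[1.0, 0.0, 0.0]])
print(ce_from_probs(z, y), ce_from_logits(z, y))

# Extreme logits: the true class's probability underflows to exactly 0,
# so the explicit route hits log(0) = -inf, while the fused route
# still returns the correct finite cost.
z_big = np.array([[1000.0, 0.0]])
y_big = np.array([[0.0, 1.0]])
print(ce_from_probs(z_big, y_big))   # inf
print(ce_from_logits(z_big, y_big))  # 1000.0
```

So the two answers are "the same thing" mathematically, but the `from_logits` path is the safer computation.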

Look at the default tolerances of `numpy.allclose` to understand the range of values that the check accepts.
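Concretely, plugging the two costs from this thread into `numpy.allclose` (default `rtol=1e-5`, `atol=1e-8`) shows that their difference is well inside the acceptance window:

```python
import numpy as np

# The two costs reported above differ only past the sixth decimal place.
a, b = 0.810287, 0.81028694

# allclose accepts |a - b| <= atol + rtol * |b|
# here: 6e-8 <= 1e-8 + 1e-5 * 0.81028694 ≈ 8.1e-6
print(np.allclose(a, b))  # True
```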