When exploring the optional lab “Softmax function”, I found a statement like this: “In the preferred organization the final layer has a linear activation, and for historical reasons, the outputs in this form are referred to as logits.”

Then I googled the word “logit” to understand it further, and the explanations say that the logit model is equivalent to the logistic model.

So I’m a little confused. Why does “logit” stand for the output of a linear activation function rather than of a sigmoid activation function?

The statement about “logistic softmax automatic detection” and the math about how to deal with excessively large exponentials inspired me a lot.
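For what it’s worth, the trick the lab uses to avoid overflowing exponentials can be sketched in plain Python (the function name here is mine, not from the lab):

```python
import math

def softmax_stable(z):
    # Subtract max(z) before exponentiating so exp() never overflows.
    # Softmax is unchanged by shifting every input by the same constant,
    # so this gives exactly the same probabilities as the naive formula.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]

# Inputs this large would overflow a naive exp(z) implementation:
print(softmax_stable([1000.0, 1001.0, 1002.0]))
```

A naive `math.exp(1002.0)` raises an `OverflowError`, while the shifted version works fine and still sums to 1.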

But I’m still a little confused about the origin of the name “logit”; I just don’t understand the naming logic. Why don’t we use “from_linear” rather than “from_logits” to indicate that the input to the loss function is just a linear activation? Or is there something I’m not understanding clearly?

Setting from_logits=True for a TensorFlow loss function means that the function expects the raw linear output z (the logits) as its input, rather than probabilities. It does not extract the logits from probabilities for you. Please google how people use it, and practice it yourself.
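A sketch of what passing z directly buys you numerically, in plain Python rather than the actual TensorFlow API (function names here are mine). Computing the loss from the logit lets the library use a log-sum-exp form that never evaluates log(p) on a rounded probability:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def bce_from_probability(p, y):
    # Standard binary cross-entropy given a probability p = sigmoid(z).
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def bce_from_logits(z, y):
    # Same loss computed directly from the raw linear output z, using the
    # numerically stable form: max(z, 0) - y*z + log(1 + exp(-|z|)).
    return max(z, 0.0) - y * z + math.log(1.0 + math.exp(-abs(z)))

z, y = 2.5, 1
print(bce_from_probability(sigmoid(z), y))  # same value both ways
print(bce_from_logits(z, y))
```

This is the kind of rewrite the lab alludes to: for large |z|, sigmoid(z) rounds to exactly 0 or 1 in floating point and log(p) blows up, while the from-logits form stays finite.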

p is sigmoid(z). @farhana_hossain, please be explicit about the subject of your statement too, next time.
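On the naming question: the logit function, logit(p) = log(p / (1 - p)), is the mathematical inverse of the sigmoid (logistic) function. So if p = sigmoid(z), then z = logit(p); the raw linear output z literally is the value of the logit function, which is why it gets that name. A quick sketch (function names are mine):

```python
import math

def sigmoid(z):
    # The logistic function: maps a logit z to a probability p.
    return 1.0 / (1.0 + math.exp(-z))

def logit(p):
    # The logit (log-odds) function: the inverse of the sigmoid.
    return math.log(p / (1.0 - p))

z = 1.7
print(logit(sigmoid(z)))  # recovers z: the linear output is the logit
```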