The entropy function looks like the sigmoid loss function to me.
The loss function for the sigmoid is called cross entropy; what is the name of the entropy function? (For me it is another loss function.)
Also, one more question: the probability formula here is the naive probability, right?
Hi @tbhaxor
What do you mean specifically by "sigmoid loss function"? The entropy profile you shared has nothing directly to do with the sigmoid activation or with the context of logistic regression, if that is the direction of your question (apart from the fact that the output of the sigmoid function in a logistic regression can be interpreted as a probability estimate, which could then serve as input to the Shannon entropy). On an abstract level one can say: both the sigmoid function and the Shannon entropy can be used to describe the uncertainty of a prediction or of a probability.
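As a minimal sketch of that connection (the function names and the example logit are my own, purely for illustration):

```python
import numpy as np

def sigmoid(x):
    # Maps a real-valued logit to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def binary_entropy(p, eps=1e-12):
    # Shannon entropy (in bits) of a Bernoulli distribution with parameter p
    p = np.clip(p, eps, 1.0 - eps)  # guard against log(0)
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

logit = 0.3                  # e.g. the raw output of a logistic regression
p = sigmoid(logit)           # interpreted as a probability estimate
print(p, binary_entropy(p))  # entropy peaks at 1 bit when p = 0.5
```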
Frankly, I am not aware of an officially recognised name for the function behind the entropy profile you are showing. Still, I believe it is important to understand, since the Shannon entropy plays a crucial role in many applications, especially in AI.
Think of a random experiment, a coin toss:
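Written out, the Shannon entropy of a coin toss with $P(\text{heads}) = p$ is:

$$H(p) = -\,p \log_2 p \;-\; (1-p)\log_2(1-p)$$

It is zero for $p = 0$ or $p = 1$ (no uncertainty at all) and maximal, 1 bit, for a fair coin with $p = 0.5$.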
(Here your definition of the 2nd post also suits!)
Hope that helps!
Best regards
Christian
For example: the binary cross entropy loss function would correspond to your plot, since in the end the same formula, the Shannon entropy definition, underlies both:
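In its standard form, with $y$ the true label and $\hat{y}$ the predicted probability:

$$\mathrm{BCE}(y, \hat{y}) = -\,y \log \hat{y} \;-\; (1-y)\log(1-\hat{y})$$

Substituting $y = \hat{y} = p$ recovers exactly the entropy profile $-p\log p - (1-p)\log(1-p)$ from your plot (up to the base of the logarithm).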
Best regards
Christian
It is not linked with the sigmoid, but it looks like it.
Feel free to take a look at the sigmoid function:
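For reference, the sigmoid (logistic) function is defined as:

$$\sigma(x) = \frac{1}{1 + e^{-x}}$$

It increases monotonically from 0 to 1 (an S-shaped curve), whereas the entropy profile is an inverted U that peaks at $p = 0.5$, so the two curves have quite different shapes.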
Best regards
Christian