Entropy function and logistic loss

Hello everyone,
Does anybody understand why the entropy function looks like the logistic loss?
Andrew doesn’t explain why in the course.
I would also appreciate it if someone could recommend a course/book where I can learn about advanced maths concepts in ML.
Thanks in advance :slightly_smiling_face:

Keep learning
Leo

Hello Leo,

I am going to suggest some readings for you and hopefully they will help you understand it.

To be more exact, we say the cross-entropy function (instead of the entropy function) looks like the logistic loss. In fact, the cross-entropy function is the logistic loss.
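To make this concrete, here is the binary case written out (using y for the true label and ŷ for the predicted probability; this notation is mine, not from the course):

$$
\ell(y, \hat{y}) = -\big[\, y \log \hat{y} + (1 - y) \log (1 - \hat{y}) \,\big]
$$

This is exactly the cross-entropy between the true distribution p = (y, 1 − y) and the predicted distribution q = (ŷ, 1 − ŷ), and it is also exactly the logistic loss.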

We can write this relation for two distributions p and q: cross-entropy of p and q = entropy of p + divergence of p from q. My purpose in showing this relation is for us to see how cross-entropy and entropy are not alike: only cross-entropy speaks about two distributions.
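In symbols (again my notation, for discrete distributions p and q):

$$
\underbrace{H(p, q)}_{-\sum_x p(x) \log q(x)} \;=\; \underbrace{H(p)}_{-\sum_x p(x) \log p(x)} \;+\; \underbrace{D_{\mathrm{KL}}(p \,\|\, q)}_{\sum_x p(x) \log \frac{p(x)}{q(x)}}
$$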

Since the cross-entropy function IS the logistic loss, if a mathematical derivation of the logistic loss is sufficient for you as an explanation, then you may read the “A Simple Box Model” section of this link. Besides seeing how the cross-entropy is derived from simple ideas, another key element you want to grasp is the idea of Maximum Likelihood. Please be reminded that our best trained model is nothing more than the one that maximizes the likelihood of observing the training data.
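Here is a sketch of that Maximum Likelihood step (my notation; a single sample with label y ∈ {0, 1} and predicted probability ŷ):

$$
P(y \mid \hat{y}) = \hat{y}^{\,y}\,(1 - \hat{y})^{\,1 - y}
\;\Longrightarrow\;
-\log P(y \mid \hat{y}) = -\big[\, y \log \hat{y} + (1 - y) \log (1 - \hat{y}) \,\big]
$$

so maximizing the likelihood over all (independent) samples is the same as minimizing the sum of the logistic losses.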

If you read this wikipedia page carefully, you will find that the cost function we are optimizing in logistic regression is the sum of the logistic losses of all samples, or the sum of the cross-entropies between p and q of all samples, where p is the true probability distribution and q is the predicted probability distribution. If q = p, the divergence of p from q is 0, and the cross-entropy of p and q is equal to the entropy of p. I suggest you read the wikipedia page.
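If a numerical check helps, here is a minimal Python sketch (the function and variable names are mine, not from the wikipedia page) that verifies cross-entropy = entropy + divergence, and that the two coincide when q = p:

```python
import numpy as np

def entropy(p):
    """H(p) = -sum p * log p, skipping zero-probability terms."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def cross_entropy(p, q):
    """H(p, q) = -sum p * log q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(q[nz]))

def kl_divergence(p, q):
    """D_KL(p || q) = sum p * log(p / q)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

p = np.array([1.0, 0.0])  # true label y = 1, as the distribution (y, 1 - y)
q = np.array([0.8, 0.2])  # predicted probability y_hat = 0.8

print(cross_entropy(p, q))               # logistic loss: -log(0.8) ~ 0.223
print(entropy(p) + kl_divergence(p, q))  # same value, by the identity
print(cross_entropy(p, p), entropy(p))   # with q = p: both reduce to H(p) = 0
```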

Lastly, if you want some discussion of the form of the entropy function itself, you may read this, which views entropy from an “information” perspective and from a “counting micro-states (thermodynamics)” perspective.
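For reference, the (discrete) entropy function being discussed there is

$$
H(p) = -\sum_x p(x) \log p(x)
$$

which you can read as the expected “surprise” −log p(x) of an outcome drawn from p.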

I hope someone can do this, and you will help them if you are more specific about what maths you don’t understand or what you are trying to understand. If you just want a list of ML-related maths books for any ML topics that you might or might not be interested in, you might use Google and skip any books that are too basic for you.

Hi Raymond,
Thank you for your answer. I will follow the links to better understand this concept.

You are welcome, Leo! That topic asks about books too, so I wonder if you want to read it.