Week 1, lab 2, counting labels and weighted loss


Could anyone explain how we get these equations, please?
Obviously, we derive them from known equations, but I'm a little confused by the calculations.


Hi @Valeria_Tokareva

Welcome to the community!

This formula is that of logarithmic loss (aka log loss) or cross-entropy loss, which is a common loss function and evaluation metric for binary classification models such as logistic regression.
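For reference (a standard form, not copied from the lab itself): for a single example with true label $y \in \{0, 1\}$ and predicted probability $\hat{y}$, the unweighted log loss is

```latex
L(y, \hat{y}) = -\big[\, y \log(\hat{y}) + (1 - y) \log(1 - \hat{y}) \,\big]
```

and a class-weighted version multiplies the positive and negative terms by weights $w_{+}$ and $w_{-}$, which is the kind of modification used to counter class imbalance.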

It tells you how close the predicted probability is to the label (for instance, to 0 or 1 in the case of a binary classification into negative and positive classes). A lower loss, of course, implies better model performance.
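To make this concrete, here is a minimal NumPy sketch of a class-weighted binary cross-entropy. It assumes (as the "counting labels" part of the lab suggests) that each term is weighted by the frequency of the *opposite* class, so the rarer class contributes more; the function and variable names are my own, not the lab's:

```python
import numpy as np

def weighted_bce(y_true, y_pred, eps=1e-7):
    """Class-weighted binary cross-entropy.

    The positive term is weighted by the fraction of negatives and
    vice versa, so errors on the rarer class cost more.
    """
    y_true = np.asarray(y_true, dtype=float)
    # Clip predictions away from 0 and 1 so the logs stay finite.
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1 - eps)
    w_pos = 1 - y_true.mean()  # fraction of negative examples
    w_neg = y_true.mean()      # fraction of positive examples
    return -np.mean(w_pos * y_true * np.log(y_pred)
                    + w_neg * (1 - y_true) * np.log(1 - y_pred))
```

With these weights, a dataset that is mostly negatives no longer lets the model score well by predicting "negative" everywhere: the few positive examples carry a large weight.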

Please look at any reference on the topic of the cross-entropy loss function.

One example I could find is section 5.5 of this pdf on Logistic Regression.

Hope this helps.

Best Regards.

– Jaidev


Hello @getjaidev !

That’s exactly what I need :100:
Thank you so much!
