Logistic regression train error > 1?

I am training my model with the ‘BinaryCrossentropy’ loss function. I would normally expect the loss value to always be less than 1,

BUT

[screenshot of training output showing a loss value greater than 1]

How is this possible?


No, the loss can be bigger than 1, but the accuracy cannot be bigger than 1.

It is not possible mathematically, since the loss function is defined as follows:

$$J = -\frac{1}{m}\sum_{i=1}^{m}\left[\, y_i \log(\hat{y}_i) + (1 - y_i)\log(1 - \hat{y}_i) \,\right]$$

Hey @popaqy,
I am unable to see why it is not possible mathematically. Since $0 \le \hat{y}_i \le 1$, we have $-\infty \le \log(\hat{y}_i) \le 0$. Now, if you plug this range of values into the formula above, the loss can be greater than 1 (see the example below). Am I missing something?
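For instance (made-up numbers, just to illustrate, not the values from the screenshot): take a single example with $y_i = 1$ and a confidently wrong prediction $\hat{y}_i = 0.1$. Then

$$-\left[\, 1 \cdot \log(0.1) + 0 \cdot \log(0.9) \,\right] = -\log(0.1) \approx 2.3,$$

which is already greater than 1.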

Cheers,
Elemento


Try that calculation assuming that all the predictions are wrong.
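As a quick numerical sanity check (the numbers below are my own illustration, not the ones from the screenshot), here is what the averaged binary cross-entropy looks like when every prediction is confidently wrong:

```python
import numpy as np

# Illustrative numbers only (not the values from the screenshot above):
# the true labels are all 1, but the model confidently predicts values near 0.
y_true = np.array([1.0, 1.0, 1.0, 1.0])
y_pred = np.array([0.1, 0.05, 0.2, 0.1])

# Binary cross-entropy averaged over the m examples:
# J = -(1/m) * sum_i [ y_i*log(yhat_i) + (1 - y_i)*log(1 - yhat_i) ]
eps = 1e-12                              # guard against log(0)
y_pred = np.clip(y_pred, eps, 1 - eps)
loss = np.mean(-(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)))

print(loss)  # ~2.30, i.e. well above 1
```

Pushing the wrong predictions even closer to 0 makes the loss grow without bound, so there is no upper limit of 1.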


This article could be helpful:

Binary cross entropy

I was a bit confused because I was trying to make the connection between J train in the decision tree algorithm and J train in logistic regression.

I first assumed that they calculate J train the same way, but now I realize that I have to do a bit of calculation myself before I can properly compare the two J train values.