loss = sparse_categorical_crossentropy

Can you explain this?

Here we are defining the loss function that the model will minimize during training.

Hi @GEMBALI_SAI_SANTOSH ,

I guess what you want to know is why you sometimes see categorical_crossentropy and other times sparse_categorical_crossentropy. The easy, simple answer is: it depends on whether your labels are one-hot encoded or not. If they are not, meaning each label is a single integer class index, you need to use sparse_categorical_crossentropy as the loss function.
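To make the difference concrete, here is a small NumPy sketch (not the actual Keras implementation, just an illustration of the same math): both losses compute the same cross-entropy, they just expect the labels in a different format.

```python
import numpy as np

# Softmax output of a model for 3 samples and 4 classes
probs = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.8, 0.05, 0.05],
    [0.2, 0.2, 0.5, 0.1],
])

# "Sparse" labels: one integer class index per sample
sparse_labels = np.array([0, 1, 2])

# One-hot labels: a full probability vector per sample
one_hot_labels = np.eye(4)[sparse_labels]

# sparse_categorical_crossentropy: pick the predicted probability
# of the true class directly via the integer index
sparse_loss = -np.log(probs[np.arange(len(sparse_labels)), sparse_labels])

# categorical_crossentropy: multiply by the one-hot vector and sum
categorical_loss = -np.sum(one_hot_labels * np.log(probs), axis=1)

# Same per-sample loss either way; only the label format differs
print(np.allclose(sparse_loss, categorical_loss))  # True
```

So picking the loss is really just a matter of matching the format your labels are already in: integer indices → sparse_categorical_crossentropy, one-hot vectors → categorical_crossentropy.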


Basically, the easiest way to explain the difference between these 2 loss functions: if your targets are one-hot encoded, use categorical_crossentropy; in the other case, where each target is a single integer class index (a number you cannot interpret as a probability in [0, 1]), use sparse_categorical_crossentropy.