Why sigmoid is used for binary classification in VGG16

While training a VGG16 model on my cat vs. dog data, we use Dense(1, activation='sigmoid'). I want to ask: why do we use sigmoid for binary classification, and why not the softmax activation?
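For reference, a minimal pure-Python sketch of what that final Dense(1, activation='sigmoid') head computes (the feature vector x, weights w, and bias b below are toy values, not real VGG16 weights):

```python
import math

def sigmoid(z):
    # Squash a real-valued logit into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def dense_1_sigmoid(x, w, b):
    # A Dense(1) layer is just a dot product plus a bias,
    # followed by the sigmoid activation
    z = sum(xi * wi for xi, wi in zip(x, w)) + b
    return sigmoid(z)

# Toy 3-dimensional feature vector and weights (illustrative only)
p = dense_1_sigmoid([0.2, -1.0, 0.5], [1.5, 0.3, -0.7], b=0.1)
print(0.0 < p < 1.0)  # prints True: the output is a probability, not a hard 0/1
```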

Sigmoid is what you use for the binary case. It is the equivalent of softmax for the 2-class case.
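To see that equivalence concretely, here is a small pure-Python check. A single sigmoid logit z gives the same probability as a two-class softmax over the logits [z, 0] (the value z = 1.7 is just an arbitrary example):

```python
import math

def sigmoid(z):
    # Logistic function: maps any real z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def softmax(logits):
    # Standard softmax over a list of logits
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

z = 1.7
p_sigmoid = sigmoid(z)
p_softmax = softmax([z, 0.0])[0]
print(abs(p_sigmoid - p_softmax) < 1e-12)  # prints True: the two probabilities match
```

Algebraically, softmax([z, 0])[0] = e^z / (e^z + 1) = 1 / (1 + e^(-z)) = sigmoid(z), which is why a single sigmoid unit suffices for two classes.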


Hi @Rishabh_Jain3 !

Cats vs. dogs is a ‘binary classification’ problem: we want to select one of 2 classes.

The sigmoid activation function is going to produce exactly that: a binary result.

The softmax activation is a generalization of the sigmoid, and for the specific case of binary classification, using one or the other amounts to almost the same thing; I’d say it is a matter of preference.

And of course, if you were going to classify more than 2 classes, you’d pick softmax.

Ah! You beat me to the punch! :)

@Juan_Olano sir, the range of sigmoid is [0 to 1], so how can it give a binary result?
Is there a threshold used with sigmoid, i.e., if the value is less than 0.5 then it's class 0, else class 1? Does it work like that?

Hi @Rishabh_Jain3 ,

“The range of sigmoid is [0 to 1]”: that is correct; strictly speaking, the range of sigmoid is the open interval (0, 1), since it never reaches exactly 0 or 1.

“Is there a threshold used with sigmoid, i.e., if the value is less than 0.5 then it's class 0, else class 1?”: Exactly! :) You got this right. You define a threshold of, say, 0.5.

For instance:

For values > 0.5 you call class 1.
For values <= 0.5 you call class 0.

You could define the threshold at a different value, like 0.3 or 0.7, depending on your needs. In all exercises where you use sigmoid, you’ll always see a threshold applied.
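As a sketch, thresholding a sigmoid output could look like this (0.5 is the assumed default cut-off; swap in 0.3 or 0.7 as needed):

```python
def predict_class(probability, threshold=0.5):
    # Convert a sigmoid probability into a hard 0/1 class label
    return 1 if probability > threshold else 0

print(predict_class(0.73))                 # prints 1: above 0.5, so class 1
print(predict_class(0.41))                 # prints 0: at or below 0.5, so class 0
print(predict_class(0.41, threshold=0.3))  # prints 1: a looser threshold flips it
```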

Juan

Oh, thank you so much @Juan_Olano. Now my doubt is completely resolved.