For a multi-class problem, when we use ‘softmax’:
Do we need to do pre-processing to make y start from 0 and go up by 1?
Or is TF smart enough to automatically infer the categories from the Y training samples?
Thanks,
Lizhang
Hi, @Lizhang_Qin!
When you use softmax for the model output, it returns values between 0 and 1. So if your y is in integer format (say, a class index), you have to transform it into a one-hot encoding to use categorical cross-entropy loss. Alternatively, you can use sparse categorical cross-entropy, which accepts integer class indices directly.
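To make the two options concrete, here is a minimal NumPy-only sketch (the label values `[3, 7, 7, 9]` are made up for illustration). Note that Keras does not remap arbitrary label values for you: sparse categorical cross-entropy expects integer indices in the range 0 to num_classes − 1, so labels like these first need to be remapped.

```python
import numpy as np

def remap_labels(y):
    """Map arbitrary label values (e.g. [3, 7, 7, 9]) to contiguous
    indices 0..K-1, as sparse categorical cross-entropy expects."""
    classes = np.unique(y)                    # sorted unique label values
    lookup = {c: i for i, c in enumerate(classes)}
    return np.array([lookup[v] for v in y]), classes

def one_hot(y_idx, num_classes):
    """Convert integer class indices to one-hot rows, which is the
    target format categorical cross-entropy expects."""
    out = np.zeros((len(y_idx), num_classes))
    out[np.arange(len(y_idx)), y_idx] = 1.0
    return out

y_raw = np.array([3, 7, 7, 9])        # raw labels, not starting at 0
y_idx, classes = remap_labels(y_raw)  # contiguous indices: [0, 1, 1, 2]
y_1hot = one_hot(y_idx, len(classes))
```

With `y_idx` you could train with sparse categorical cross-entropy; with `y_1hot` you would use categorical cross-entropy instead (in Keras, `tf.keras.utils.to_categorical` does the same job as the `one_hot` helper above).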
Thanks both for answering the questions.
Something to file away for the future: softmax is only relevant when you have multiple mutually exclusive classes. The documentation linked above calls these discrete classification tasks. Human and Horse. Dog and Cat. But what if you also have Mammal? At some point you may need to handle a hierarchical or multi-label classification where instances belong to more than one of the classes. It’s a different problem and requires a different approach to loss and optimization. Not covered in these courses, but you should realize there are real-world problems that require it (image annotation is one example).
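A small NumPy sketch of the difference (the three-class `[human, horse, mammal]` setup and the logit values are made up for illustration): softmax forces the outputs to compete and sum to 1, while independent sigmoids, the usual choice for multi-label classification paired with binary cross-entropy, score each class on its own, so several can be "on" at once.

```python
import numpy as np

# Hypothetical raw scores (logits) for classes [human, horse, mammal]
logits = np.array([2.0, 1.0, 0.5])

# Softmax: probabilities compete and sum to 1 (mutually exclusive classes)
softmax = np.exp(logits) / np.exp(logits).sum()

# Independent sigmoids: each class is scored independently (multi-label),
# so the outputs need not sum to 1 and several can exceed 0.5 at once
sigmoid = 1.0 / (1.0 + np.exp(-logits))
```

Here `softmax` sums to exactly 1, while all three `sigmoid` values are above 0.5 simultaneously, which is exactly what a multi-label problem like "human AND mammal" needs.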
Great point, ai_curious.
Yes, softmax applies to multi-class problems (where labels are mutually exclusive).
Andrew mentioned in the course material the difference between multi-class and multi-label.
I guess the term ‘label’ is sometimes confusing.