Multiclass classification using tree ensembles (random forest, XGBoost, et cetera)

Just finished course 2 in the ML specialization. I must have missed it in week 4, but was there a section, or at least a mention, of using tree ensembles (random forest, XGBoost, et cetera) for multiclass classification problems? To be clear, I'm thinking of a situation where you have many input features and a non-binary output, say, four possible classes that any one data point could belong to (cat, dog, mouse, chipmunk, instead of just cat/dog).

I imagine you can apply these algorithms to this type of problem. If so, is the output a probability for each class (like it was with neural networks and logistic regression), or just a hard classification with no probability?

Hello @naveadjensen,

Those ensemble methods you mentioned are all capable of multiclass classification. In XGBoost, for example, you can set the objective parameter to multi:softprob so that the trained model outputs a probability for each class; multi:softmax, by contrast, returns only the predicted class label (the argmax), with no probabilities. Scikit-learn's RandomForestClassifier similarly provides both predict (hard labels) and predict_proba (per-class probabilities).
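Here is a minimal sketch of the four-class setup from the question, using scikit-learn's RandomForestClassifier on a synthetic dataset (the dataset and parameter choices are illustrative assumptions, not from the course):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Toy dataset: 4 classes (think cat/dog/mouse/chipmunk) with 10 features.
X, y = make_classification(
    n_samples=200, n_features=10, n_informative=6,
    n_classes=4, random_state=0,
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)

proba = clf.predict_proba(X[:1])  # shape (1, 4): one probability per class
label = clf.predict(X[:1])        # hard label: the highest-probability class
print(proba, label)
```

XGBoost's XGBClassifier follows the same predict / predict_proba convention, so either library gives you probabilities when you want them and hard labels when you don't.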