XGBoost vs Random Forest

In the context of decision tree ensembles:

Can someone give me a guideline for choosing between Random Forest and XGBoost? As I understand it, Random Forest strengthens the model by bagging: each tree is grown on a bootstrap resample of the training data with random feature subsets at each split, and the trees' predictions are averaged. XGBoost instead uses gradient boosting: trees are added sequentially, each one fitted to the errors of the ensemble built so far.

Where does Random Forest or XGBoost shine specifically? Or are they comparably good in practice, with no clear preference?
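To make the bagging-vs-boosting distinction concrete, here is a minimal hand-rolled sketch of both ideas on a toy regression problem (the tree depth, number of rounds, and learning rate are arbitrary choices, not recommendations):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=300)

# Bagging (the Random Forest idea): independent trees on bootstrap
# resamples, predictions averaged.
bag_preds = []
for _ in range(50):
    idx = rng.integers(0, len(X), len(X))  # bootstrap resample
    tree = DecisionTreeRegressor(max_depth=3).fit(X[idx], y[idx])
    bag_preds.append(tree.predict(X))
bagged = np.mean(bag_preds, axis=0)

# Boosting (the XGBoost idea): trees added sequentially, each fitted
# to the residuals of the ensemble so far, with a shrunken update.
pred = np.zeros(len(X))
for _ in range(50):
    tree = DecisionTreeRegressor(max_depth=3).fit(X, y - pred)
    pred += 0.1 * tree.predict(X)

print("bagging  train MSE:", np.mean((y - bagged) ** 2))
print("boosting train MSE:", np.mean((y - pred) ** 2))
```

Real Random Forest additionally randomizes the features considered at each split, and real XGBoost fits trees to gradients of a chosen loss with regularization, but the structural difference (parallel averaging vs sequential error correction) is as above.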

Thanks,

Personally, I would try Random Forest as a baseline, because it is more stable and needs little tuning; then I would try XGBoost, since it usually achieves better accuracy once tuned.