XGBoost: random feature choice?

XGBoost video:

Random Forest (RF) and XGBoost are both decision-tree-based models, ok, no problem.

RF uses random sampling with replacement (bootstrapping) plus a random feature choice at each split, ok, no problem.

XGBoost uses a kind of random sampling where different weights are assigned to different training examples, ok, no problem. But what is going on with the random feature choice? Is that still done in XGBoost? Could someone explain the underlying process a bit more, to help me build a better intuition, please?
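To make the question concrete, here is a minimal stdlib-only sketch of what I mean by "random feature choice" — picking a random subset of columns before growing a tree (or a split). The function name and the `colsample` ratio are just illustrative, loosely mirroring XGBoost's `colsample_by*` parameters; this is my mental model, not XGBoost's actual code:

```python
import random

def sample_features(n_features, colsample=0.5, rng=None):
    # Randomly pick a subset of feature indices, the way RF does per split
    # and (I believe) XGBoost can do per tree/level/node via colsample_by*.
    rng = rng or random.Random(0)  # fixed seed only for reproducibility here
    k = max(1, int(n_features * colsample))
    return sorted(rng.sample(range(n_features), k))

# e.g. 10 features, keep half of them for one tree
print(sample_features(10, colsample=0.5))
```

Is this roughly what happens inside XGBoost, or is the column subsampling done differently (or not at all by default)?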

Thank you, all the best!