Question on Dot product to project data

As I see in the video lecture, the operation X' = XU[:, 0:2] is used to project the data onto 2 principal components. Beyond the dimensional correctness and the fact that the first 2 column vectors of U are the most "significant", I still don't understand why X is multiplied by the first 2 columns of U, in that order, to get the projection. When I read about SVD from other sources, they carry out the SVD A = USV^T and then compute A' = (US)(V^T[:, 0:2]), which is much clearer to me, since it operates directly on the matrix A that is to be reduced to 2 dimensions.

Hi dungdao3112000,

USV^T here is the SVD of the covariance matrix, not of the original data matrix X whose dimensions are to be reduced. Because the covariance matrix is symmetric, its SVD coincides with its eigendecomposition, so the columns of U are the eigenvectors (principal axes). To reduce the dimensions of X, it is multiplied by the eigenvectors in U[:, 0:2]. See, e.g., this explanation.
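A minimal NumPy sketch of this, assuming centered data and distinct singular values (the variable names here are illustrative, not from the lecture). It also checks the connection to the questioner's approach: the right singular vectors V of the data matrix equal the U from the covariance-matrix SVD up to sign, so the two projections agree up to sign.

```python
import numpy as np

# Hypothetical small data matrix: 5 samples, 3 features
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Center the data, then form the covariance matrix
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / (Xc.shape[0] - 1)

# SVD of the (symmetric) covariance matrix: C = U S U^T,
# so the columns of U are the eigenvectors / principal axes
U, S, Ut = np.linalg.svd(C)

# Project the centered data onto the first 2 principal components
X_proj = Xc @ U[:, 0:2]
print(X_proj.shape)  # 2-D representation of the 5 samples

# Equivalence check: SVD of the data itself, Xc = U2 S2 V2t.
# The columns of V2t.T are eigenvectors of C, matching U up to sign,
# so Xc @ V2t.T[:, 0:2] is the same projection up to sign flips.
U2, S2, V2t = np.linalg.svd(Xc, full_matrices=False)
same_up_to_sign = np.allclose(np.abs(X_proj), np.abs(Xc @ V2t.T[:, 0:2]))
print(same_up_to_sign)
```

So the lecture's X' = XU[:, 0:2] and the data-matrix SVD route give the same 2-D coordinates (up to per-component sign), which is why multiplying X by the leading columns of U performs the projection.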