L2 norm of eigenvectors * centered data

When I calculate the L2 norms of the eigenvectors v1 and v2, I get a (2, 55) matrix. The formula then says to multiply by the centered data, which is (55, 4096). The test expects a (55, 2) output. The only way I can get these two matrices to multiply is with @, and that gives me a (4096, 2) result.

We don’t need to compute any norms in this section. It turns out that the function we call to compute the eigenvalues and corresponding eigenvectors already returns the eigenvectors normalized to have length 1 (FWIW), so there is nothing more we need to do about that here.
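You can verify that for yourself with a quick check. This is just a sketch, assuming the eigendecomposition is done with something like `np.linalg.eigh` on a symmetric (covariance) matrix; the assignment's actual call may differ:

```python
import numpy as np

# Build a small symmetric matrix, like a covariance matrix
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
cov = A @ A.T  # symmetric positive semi-definite

eigenvals, eigenvecs = np.linalg.eigh(cov)

# Each column of eigenvecs is one eigenvector; its L2 norm is already 1
norms = np.linalg.norm(eigenvecs, axis=0)
print(norms)  # each value ≈ 1.0
```

So there is no separate normalization step to perform on the eigenvectors.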

I added some print statements in the perform_PCA function and here’s what I see:

X.shape (55, 4096)
eigenvecs.shape (4096, 55)
Xred.shape (55, 2)

The steps we need to do are:

Subset the eigenvector matrix by selecting the first k columns, which gives the eigenvectors corresponding to the k largest eigenvalues. That gives us a matrix of shape (4096, k). If we then do the matrix multiply X @ reduced_eigenvec, that is (55, 4096) dot (4096, k), and we end up with … wait for it … (55, k).

Note that k = 2 in the particular test case here, but we don’t want to do any hard-coding in our implementation.
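The steps above can be sketched like this. The function and variable names here are illustrative, not the assignment's actual code, and it assumes the eigenvector columns are already sorted so the first k columns go with the k largest eigenvalues:

```python
import numpy as np

def project_onto_top_k(X, eigenvecs, k):
    """Project centered data X onto the top-k eigenvectors.

    Assumes the columns of eigenvecs are already ordered so that
    the first k columns correspond to the k largest eigenvalues.
    """
    reduced_eigenvec = eigenvecs[:, :k]  # shape (n_features, k)
    Xred = X @ reduced_eigenvec          # (n_samples, n_features) @ (n_features, k)
    return Xred                          # -> (n_samples, k)

# Dummy arrays with the shapes from the print statements above
X = np.zeros((55, 4096))
eigenvecs = np.zeros((4096, 55))
print(project_onto_top_k(X, eigenvecs, 2).shape)  # (55, 2)
```

Note that k stays a parameter, so nothing is hard-coded to the k = 2 test case.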


Again, thanks for your feedback Paulin! These things always turn out to be so much simpler than I make them in my head. Linear algebra is a new way of thinking for an old data step programmer…

His name is Paul, not Paulin. It is Paul in Palo Alto :slightly_smiling_face:
