Eigenvectors using PCA


I have the observations $Y$, and I have the eigenvalue matrix $V$, and I want to calculate the eigenvectors and the principal components using PCA with R. I'm trying $U = V Y^T$ with $V$ the eigenvector matrix, but I didn't get the expected result.

PS: I don't want to use the covariance matrix.

Best answer:

If you have the data matrix $X\in \mathbb{R}^{N\times d}$, where $N$ is the number of observations and $d$ is the data dimension, the principal components (transformed data) are $XV$, where $V\in\mathbb{R}^{d\times k}$ is the matrix of eigenvectors, i.e. $k$ eigenvectors are stored in the columns of $V$ with $k\leq d$. In R you can use X %*% V to compute the matrix product.
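As an illustration of the projection $XV$ (in Python/NumPy rather than R, and with made-up random data), one common way to obtain $V$ without forming the covariance matrix is the SVD of the centered data; the right singular vectors are the eigenvectors of the (uncomputed) covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))      # N = 100 observations, d = 5
Xc = X - X.mean(axis=0)            # center the data first

# Columns of Vt.T are the eigenvectors of the covariance matrix
# of Xc, obtained without ever computing that matrix.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
V = Vt.T[:, :2]                    # keep k = 2 eigenvectors

scores = Xc @ V                    # principal components, shape N x k
print(scores.shape)
```

The same scores can also be read off as the scaled left singular vectors, `U[:, :2] * S[:2]`, which is one way to check the projection.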

To get the eigenvectors, you can use prcomp or princomp in R, but these algorithms implicitly use the covariance matrix. You could use the onlinePCA package of Cardot & Degras, which computes only an estimate of the eigenvectors but does not require the expensive computation of the covariance matrix. Unfortunately, the eigenvalues alone don't help you compute the eigenvectors of the covariance matrix without explicitly computing the covariance matrix.

Maybe you could use something like stochastic gradient descent on $$(\lambda_i I - xx^T)v_i = 0,$$ where you sample $x$ to compute the eigenvectors $v_i$.
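A minimal sketch of that idea, assuming one concrete variant of it: Oja's rule, a stochastic approximation that updates an estimate of the leading eigenvector one sample at a time (again in Python/NumPy, with synthetic data; the step-size schedule here is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic centered data with one dominant direction of variance
X = rng.normal(size=(2000, 3)) @ np.diag([3.0, 1.0, 0.5])
X -= X.mean(axis=0)

# Oja's rule: stochastic-gradient-style update toward the leading
# eigenvector of the covariance matrix, using one sample x per step.
v = rng.normal(size=3)
v /= np.linalg.norm(v)
for t, x in enumerate(X, start=1):
    eta = 1.0 / t                 # decaying step size
    v += eta * x * (x @ v)        # move v toward C v, estimated by x x^T v
    v /= np.linalg.norm(v)        # keep the estimate unit-length

# For checking only: compare against the exact leading eigenvector
C = X.T @ X / len(X)
_, V = np.linalg.eigh(C)
v_true = V[:, -1]
print(abs(v @ v_true))            # close to 1 if the estimate converged
```

Eigenvectors are only defined up to sign, hence the absolute value in the comparison.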

Alternatively, you can search for "recursive PCA", but I don't know of an implemented package for it.