It is known that when a matrix $A$ is symmetric, computing its SVD $A = U \Sigma V^T$ reduces to computing the eigendecomposition
$A = W \Lambda W^T$.
The left singular vectors $u_i$ are $w_i$, the right singular vectors $v_i$ are $\operatorname{sign}(\lambda_i)\, w_i$, and the singular values $\sigma_i$ are the magnitudes $|\lambda_i|$ of the eigenvalues.
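To make this concrete, here is a small sketch (assuming NumPy; the matrix is just a random symmetric example) that builds an SVD of a symmetric matrix directly from its eigendecomposition using exactly these rules:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical example: a random symmetric matrix with mixed-sign eigenvalues
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2

# Eigendecomposition A = W diag(lam) W^T (eigh is for symmetric matrices)
lam, W = np.linalg.eigh(A)

# Build an SVD from it: sigma_i = |lam_i|, u_i = w_i, v_i = sign(lam_i) * w_i
sigma = np.abs(lam)
U = W
V = W * np.sign(lam)  # scales column i of W by sign(lam_i)

# Check the reconstruction A = U diag(sigma) V^T
print(np.allclose(A, U @ np.diag(sigma) @ V.T))  # True
```

(The only difference from a library SVD is that the singular values here are not sorted in descending order.)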
So, if I compute $A = W \Lambda W^T$, how can I do PCA? As far as I understand, I would take $\sigma_i = \sqrt{\lambda_i}$ to relate the singular values to the eigenvalues. Is this correct? What else can you say about this?
For any matrix $A$ you can compute the SVD $A = U\Sigma V^T$. Here the columns of $U$ are eigenvectors of $AA^T$, and the columns of $V$ are eigenvectors of $A^TA$. Writing $AA^T = U\Lambda U^T$, each diagonal entry of $\Lambda$ satisfies $\lambda_i = \sigma_i^2$. You can prove this directly: $AA^T = U\Sigma V^T V \Sigma^T U^T = U\Sigma\Sigma^T U^T$.
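This identity is easy to check numerically. A short sketch (assuming NumPy; the rectangular matrix is a made-up example) comparing the eigenvalues of $AA^T$ against the squared singular values of $A$:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical rectangular matrix: A need not be square, but A A^T is
A = rng.standard_normal((3, 5))

U, s, Vt = np.linalg.svd(A)
# eigvalsh returns eigenvalues in ascending order; reverse to match
# the descending order of the singular values
lam = np.linalg.eigvalsh(A @ A.T)[::-1]

print(np.allclose(lam, s**2))  # True
```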
If $A$ is your data matrix with each column a data point, $A$ is not necessarily square, but $AA^T$ is. If you first center $A$ with `A = A - repmat(mean(A,2), 1, size(A,2));`, then $AA^T$ is (up to a factor of $1/(n-1)$) the covariance matrix. Because $AA^T$ is positive semidefinite, all of its eigenvalues are nonnegative. Take the eigenvectors corresponding to the $p$ largest eigenvalues, and you get the PCA projection matrix.

Computing the eigenvectors of $AA^T$ is the same as computing the SVD of $A$, so you can just compute $A = U\Sigma V^T$ and take the first $p$ columns of $U$ as your projection matrix. Then project your data by $U_{(:,1:p)}^T A$.
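The whole recipe can be sketched in a few lines (assuming NumPy; the data matrix, its dimensions, and `p` are made-up for illustration): center the columns, take a thin SVD, and project onto the first $p$ left singular vectors.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical data: d features, n data points, each column is a point
d, n, p = 5, 100, 2
A = rng.standard_normal((d, n))

# Center each feature (row) by subtracting its mean over the data points
A = A - A.mean(axis=1, keepdims=True)

# Thin SVD; the first p columns of U span the top principal subspace
U, s, Vt = np.linalg.svd(A, full_matrices=False)
proj = U[:, :p]          # d x p projection matrix
Y = proj.T @ A           # p x n projected data

print(Y.shape)           # (2, 100)
# Row i of Y is s_i * v_i^T, so its squared norm equals s_i^2
print(np.allclose((Y**2).sum(axis=1), s[:p]**2))  # True
```

Note that `np.linalg.svd` already returns the singular values in descending order, so no extra sorting of eigenvalues is needed with this route.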