I am working with a relatively large matrix $A$ (200k x 200k), and I want to compute the eigenvectors of the Laplacian
$L = D - A^2$, where $A$ is symmetric.
I don't need all the eigenvectors, just a few (say 100). For the normalized Laplacian, I found that it suffices to compute a truncated SVD of $A D^{-1/2}$: if $A D^{-1/2} = USV^T$, then
$$D^{-1/2} A^2 D^{-1/2} = V S^2 V^{T}$$
and the columns of $V$ are eigenvectors of $L_{norm} = I - D^{-1/2} A^2 D^{-1/2}$. This worked well; computing the truncated SVD was quite fast on this matrix.
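For concreteness, the SVD trick above can be sketched as follows on a small random stand-in for $A$ (the matrix construction, sizes, and the tiny diagonal regularization are all placeholders for illustration; note that the degrees $D_{ii} = (A^2 \mathbf{1})_i$ can also be obtained without forming $A^2$, via two matrix-vector products):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds

rng = np.random.default_rng(0)
n, k = 300, 10

# Toy stand-in for the large sparse symmetric nonnegative A.
M0 = sp.random(n, n, density=0.05, random_state=rng)
A = (M0 + M0.T).tocsr()
A = A + 1e-3 * sp.identity(n)  # tiny diagonal so no degree is zero in this random example

# Degrees of A^2 without forming A^2: d = A^2 @ 1 = A @ (A @ 1).
d = A @ (A @ np.ones(n))
Dinv_sqrt = sp.diags(1.0 / np.sqrt(d))

# Truncated SVD of B = A D^{-1/2}; since A is symmetric,
# B^T B = D^{-1/2} A^2 D^{-1/2}, so the right singular vectors V
# are eigenvectors of D^{-1/2} A^2 D^{-1/2} with eigenvalues s^2.
B = A @ Dinv_sqrt
U, s, Vt = svds(B, k=k)
V = Vt.T

# Verify the eigen-relation on the computed pairs.
M = Dinv_sqrt @ (A @ (A @ Dinv_sqrt))
print(np.allclose(M @ V, V * s**2, atol=1e-8))
```

The corresponding eigenvalues of $L_{norm}$ are then $1 - s^2$.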
However, I haven't found a way to compute the eigenvectors of the unnormalized $L$ without forming $A^2$, and I can't form that matrix: although $A$ is sparse, $A^2$ is not, and the computation runs out of memory even though I have 256 GB of RAM.
Is there a similar trick to compute eigenvectors of $L = D - A^2$ that doesn't require forming $A^2$?
Thanks
Lanczos and randomized SVD can both compute a rank-$r$ truncated SVD (or, for symmetric matrices, a partial eigendecomposition) from applications of the underlying matrix to $O(r)$ vectors. Neither requires the matrix to be formed explicitly: you only need a routine that computes $v \mapsto Lv = Dv - A(Av)$, which costs two sparse matrix-vector products.
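A minimal matrix-free sketch using SciPy's `LinearOperator` with `eigsh` (ARPACK's Lanczos-type solver); the toy matrix, sizes, and parameters are placeholders, and I'm assuming $D$ holds the degrees of $A^2$, i.e. $D_{ii} = (A^2\mathbf{1})_i$:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, eigsh

rng = np.random.default_rng(0)
n, k = 300, 5

# Toy stand-in for the large sparse symmetric nonnegative A.
M0 = sp.random(n, n, density=0.05, random_state=rng)
A = (M0 + M0.T).tocsr()

# Diagonal of D (degrees of A^2), computed without forming A^2.
d = A @ (A @ np.ones(n))

# Matrix-free application of L = D - A^2: two sparse mat-vecs per call,
# so A^2 is never materialized.
def apply_L(v):
    return d * v - A @ (A @ v)

L = LinearOperator((n, n), matvec=apply_L, dtype=float)

# Lanczos on the operator; 'SA' requests the smallest (algebraic)
# eigenvalues, which is usually what you want from a Laplacian.
vals, vecs = eigsh(L, k=k, which='SA')
print(vals)
```

One caveat: Lanczos converges fastest to extremal, well-separated eigenvalues, and the small end of a Laplacian spectrum is often clustered, so convergence to the smallest eigenpairs can be slow at 200k scale. Shift-invert (`sigma=0` in `eigsh`) accelerates this but requires solving linear systems with $L$, which again runs into the problem of not having $L$ explicitly, so you would need an iterative inner solver.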