Projection of Vector onto Matrix Top Spectrum


I'm looking for an efficient way to project a given $n$-dimensional vector $w$ onto a specific part of the spectrum of a given matrix $A$.

Specifically, consider a positive-definite $n \times n$ matrix $A$ (in fact, a Gram matrix) with eigendecomposition $A = V \cdot \Lambda \cdot V^T$. Each column $v_i$ of $V$ is an eigenvector of $A$ with corresponding eigenvalue $\lambda_i = \Lambda_{ii}$, where $\lambda_1 \geq \lambda_2 \geq \ldots \geq \lambda_n$.

Moreover, the $\{ v_i \}$ form an orthonormal basis of the $n$-dimensional vector space, so $w$ can be written as $w = \sum_{i = 1}^{n} a_i \cdot v_i$ where $a_i = \langle w, v_i \rangle$. I'm looking for an efficient algorithm to compute $\hat{w} = \sum_{i = 1}^{k} a_i \cdot v_i$ where $k \ll n$. In other words, I want to project $w$ onto the subspace spanned by the top $k$ eigenvectors of $A$.
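To make the setup concrete, here is a small reference computation (not an efficient solution, just the definition): with the full eigendecomposition, $\hat{w} = V_k V_k^T w$ where $V_k$ holds the top $k$ eigenvectors as columns. The matrix below is a randomly generated Gram matrix for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 8, 3

# Random Gram matrix A = B^T B (positive semi-definite, symmetric).
B = rng.standard_normal((n, n))
A = B.T @ B
w = rng.standard_normal(n)

# Full eigendecomposition; eigh returns eigenvalues in ascending order,
# so reverse to get lambda_1 >= lambda_2 >= ... >= lambda_n.
lam, V = np.linalg.eigh(A)
lam, V = lam[::-1], V[:, ::-1]

# Projection onto the span of the top-k eigenvectors:
# w_hat = sum_{i<=k} <w, v_i> v_i = V_k V_k^T w
Vk = V[:, :k]
w_hat = Vk @ (Vk.T @ w)
```

Note that $\hat{w}$ is an orthogonal projection: applying it twice gives the same vector, and the residual $w - \hat{w}$ is orthogonal to every $v_i$ with $i \leq k$.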

Given $A$ and $w$, I could find $\hat{w}$ by computing $A$'s full eigendecomposition/SVD. However, in my case $n$ is so large that such a decomposition is practically impossible. Is there another way to compute $\hat{w}$, perhaps an iterative algorithm? Failing that, can $\hat{w}$ be approximated efficiently up to some $\epsilon$?
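For what it's worth, one standard candidate for "iterative algorithm" here is a Krylov method such as Lanczos, which only needs matrix-vector products with $A$ and recovers just the top $k$ eigenpairs. A sketch using SciPy's `eigsh` (the parameters below are illustrative, not from the question):

```python
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(1)
n, k = 200, 5
B = rng.standard_normal((n, n))
A = B.T @ B  # stand-in Gram matrix; in practice A may be huge/implicit
w = rng.standard_normal(n)

# Lanczos iteration for the k algebraically largest eigenpairs.
# Only matrix-vector products with A are needed, so A could also be
# passed as a scipy.sparse.linalg.LinearOperator.
lam, Vk = eigsh(A, k=k, which='LA')

# Project w onto the span of the computed top-k eigenvectors.
w_hat = Vk @ (Vk.T @ w)
```

The cost is roughly a modest number of matrix-vector products per requested eigenpair, so for $k \ll n$ this avoids the $O(n^3)$ full decomposition entirely.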

More generally, is there a fast way to compute the projection $\sum_{i = k_1}^{k_2} a_i \cdot v_i$ for some $1 \leq k_1 < k_2 \leq n$?

Any suggestions?