Let $\mathbf{A}\in\mathbb{R}^{N\times N}$ be a matrix, and let $\mathbf{\Lambda_B}\in\mathbb{R}^{B\times B}$ be the diagonal matrix containing $B$ ($B\ll N$) eigenvalues of $\mathbf{A}$. Is there an iterative algorithm that can find $\mathbf{V_B}\in\mathbb{R}^{N\times B}$ satisfying $$ \mathbf{AV_B} =\mathbf{V_B \Lambda_B} + \mathbf{Y_B}, $$ for a given bias matrix $\mathbf{Y_B}\in \mathbb{R}^{N\times B}$?
The reason I'm looking for an iterative algorithm is that $\mathbf{A}$ cannot be stored explicitly; we can only compute the matrix-vector product $\mathbf{Av}$.
Edit (additional note): Each column of $\mathbf{Y_B}$ is guaranteed to be orthogonal to the corresponding eigenvector of $\mathbf{A}$.
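To make the setting concrete, here is a minimal matrix-free sketch (my own illustration on a synthetic operator, not a claimed answer). Reading the equation column by column gives $(\mathbf{A}-\lambda_i\mathbf{I})\mathbf{v}_i = \mathbf{y}_i$, which is singular but, by the orthogonality guarantee above, consistent when $\mathbf{A}$ is symmetric; so one candidate is a Krylov solver such as SciPy's `minres` with its `shift` parameter, which only needs the matvec:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

N, B = 200, 3
rng = np.random.default_rng(0)

# Synthetic symmetric A with a known, well-separated spectrum -- a stand-in
# for the implicit operator; in practice only the matvec below is available.
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
d = np.arange(1.0, N + 1)              # eigenvalues 1, 2, ..., N
A_dense = (Q * d) @ Q.T

A = LinearOperator((N, N), matvec=lambda v: A_dense @ v, dtype=float)

lam = d[:B]                            # diagonal of Lambda_B
U = Q[:, :B]                           # corresponding eigenvectors

# Bias matrix Y_B with each column orthogonal to its eigenvector,
# matching the stated guarantee (this makes each shifted system consistent).
Y = rng.standard_normal((N, B))
Y -= U * np.sum(U * Y, axis=0)

# Solve (A - lam_i * I) v_i = y_i column by column. The shifted operator is
# singular, but MINRES handles consistent singular symmetric systems.
V = np.zeros((N, B))
for i in range(B):
    v_i, info = minres(A, Y[:, i], shift=lam[i])
    V[:, i] = v_i

# Residual of the first column of A V_B = V_B Lambda_B + Y_B (should be small).
print(np.linalg.norm(A.matvec(V[:, 0]) - lam[0] * V[:, 0] - Y[:, 0]))
```

This is only to fix notation; what I am asking is whether there is a standard, robust iterative algorithm for this problem, ideally one that treats all $B$ columns as a block.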