Is there a quantitative error analysis for eigenvalue-based matrix compression?


I have recently been studying how to use eigenvalues and eigenvectors to compress matrices. Specifically, we have a positive semi-definite matrix $A$, which can be diagonalized by an orthogonal similarity transformation $A = PDP^{-1}$ (with $P$ orthogonal, so $P^{-1} = P^\top$).

Assume that the eigenvalues in the diagonal matrix $D$ are sorted in descending order from top-left to bottom-right. I have found that by keeping only the first $k$ columns of matrix $P$ (call it $P_k$) and only the first $k$ rows of matrix $P^{-1}$ (call it $P^{-1}_k$), the product $P_kDP^{-1}_k$ becomes increasingly close to $A$ as $k$ increases. (Note that only the leading $k \times k$ block of $D$ actually enters this product.)

Specifically, if we define

$$a_k = \|P_kDP^{-1}_{k} - A\|_2$$

then $a_k$ appears to be monotonically decreasing (this is just an empirical observation).
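For concreteness, here is a small NumPy sketch of the experiment I have in mind (the random test matrix and the dimension $n = 8$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
B = rng.standard_normal((n, n))
A = B @ B.T                      # a positive semi-definite test matrix

# eigh returns eigenvalues in ascending order; reverse to get descending
w, P = np.linalg.eigh(A)
w, P = w[::-1], P[:, ::-1]
D = np.diag(w)

errors = []
for k in range(1, n + 1):
    Pk = P[:, :k]                # first k columns of P
    # Since P is orthogonal, P^{-1} = P^T, so the first k rows of
    # P^{-1} are just Pk.T; only the leading k x k block of D enters.
    Ak = Pk @ D[:k, :k] @ Pk.T
    errors.append(np.linalg.norm(Ak - A, 2))   # spectral norm a_k

print(errors)  # decreases as k grows, reaching 0 at k = n
```

Running this, the list `errors` decreases with $k$ and reaches (numerically) zero at $k = n$, which is what prompted the question.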

I am curious whether there is any theory that provides an error analysis for this approximation, e.g., a formula or bound for $a_k$. Thank you!