Prove that multiplication by random matrices recovers eigenvectors


First, let $\mathbf{A}$ be an $N\times N$ positive semidefinite (covariance) matrix and $\mathbf{X}$ be an $N\times M$ random matrix whose columns $\mathbf{x}_i$ are i.i.d. jointly Gaussian random vectors. We can compute the eigendecomposition (equivalently the SVD, since $\mathbf{X}^\top\mathbf{A}\mathbf{X}$ is symmetric positive semidefinite)

\begin{align} \mathbf{T\Lambda T}^\top = \mathbf{X}^\top\mathbf{A}\mathbf{X} \end{align}

and can subsequently define an $N\times M$ matrix \begin{align} \mathbf{P}=\mathbf{X}\mathbf{T}. \end{align}

I would like to show that the (column-normalized) column vectors of $\mathbf{P}$ are eigenvectors of $\mathbf{A}$ in the case where $\left<\mathbf{x}_i\mathbf{x}_j^\top\right>=\delta_{ij}\mathbf{I}$, where angle brackets denote expectation, in the limit of large $M,N$.

Proposed solution: For large $M, N$, $\mathbf{X}\mathbf{X}^\top\rightarrow M\mathbf{I}$ and $\mathbf{X}^{\top}\mathbf{X}\rightarrow N\mathbf{I}$.
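These limits hold entrywise and are easy to check numerically; here is a small sketch (NumPy assumed; the sizes $N$, $M$ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 400, 500  # illustrative sizes only

# Columns of X are i.i.d. standard Gaussian vectors in R^N
X = rng.standard_normal((N, M))

# Entrywise, X X^T / M should be close to the N x N identity...
err1 = np.abs(X @ X.T / M - np.eye(N)).max()

# ...and X^T X / N close to the M x M identity
err2 = np.abs(X.T @ X / N - np.eye(M)).max()

print(err1, err2)
```

The entrywise deviations shrink like $1/\sqrt{M}$ and $1/\sqrt{N}$ respectively as the sizes grow.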

If I left- and right-multiply my first equation by $\mathbf{X}$ and $\mathbf{X}^\top$, respectively, and substitute, I obtain

\begin{align} \mathbf{X}\mathbf{T\Lambda T}^\top\mathbf{X}^\top &= \mathbf{X}\mathbf{X}^\top\mathbf{A}\mathbf{X}\mathbf{X}^\top\\ \implies \mathbf{P}\mathbf{\Lambda}\mathbf{P}^\top &= M^2\mathbf{A}. \tag{1}\label{1} \end{align}

Furthermore, I have that \begin{align} \mathbf{P}^\top\mathbf{P}&= \mathbf{T}^\top\mathbf{X}^\top\mathbf{X}\mathbf{T}\\ &= N\mathbf{T}^\top\mathbf{T}\\ &=N\mathbf{I} \end{align}

so that the normalized matrix $\tilde{\mathbf{P}}=N^{-1/2}\mathbf{P}$ has orthonormal columns, where we have also used the orthogonality of $\mathbf{T}$. Substituting this into \eqref{1} and rearranging yields

\begin{align} \frac{N}{M^2}\tilde{\mathbf{P}}\mathbf{\Lambda}\tilde{\mathbf{P}}^\top &= \mathbf{A}. \end{align}

By uniqueness of the eigendecomposition, we can then conclude that the columns of $\tilde{\mathbf{P}}$ are eigenvectors of $\mathbf{A}$. This seems fairly sensible except that $\mathbf{P}$ is not square...
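For what it's worth, the construction can be tried numerically. The sketch below (hypothetical sizes; NumPy assumed) takes $M \gg N$ so that $\mathbf{XX}^\top/M \approx \mathbf I$, and checks the column of $\mathbf P$ belonging to the largest eigenvalue against the top eigenvector of $\mathbf A$. Note that $\mathbf X^\top\mathbf{AX}$ has rank at most $N$, so only $N$ columns of $\mathbf P$ carry information:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 5, 2000  # hypothetical sizes with M >> N

# A positive semidefinite A = Q Gamma Q^T with well-separated eigenvalues
gamma = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
A = Q @ np.diag(gamma) @ Q.T

X = rng.standard_normal((N, M))

# T Lambda T^T = X^T A X  (eigh returns eigenvalues in ascending order)
lam, T = np.linalg.eigh(X.T @ A @ X)

# P = X T; take the column belonging to the largest eigenvalue and normalize it
p = X @ T[:, -1]
p /= np.linalg.norm(p)

# Compare with the true top eigenvector of A (up to sign)
q_top = np.linalg.eigh(A)[1][:, -1]
cos = abs(p @ q_top)

# Residual of the approximate eigenpair (lam[-1] / M, p)
mu = lam[-1] / M
resid = np.linalg.norm(A @ p - mu * p) / mu
print(cos, resid)
```

With these sizes the alignment $|\cos|$ is close to $1$ and the relative residual is small; both degrade as $N/M$ grows, which is one way to see why the "$\mathbf P$ is not square" worry matters.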


Too long for a comment.

  • What is the meaning of $\rightarrow$ in $\mathbf{X}\mathbf{X}^\top\rightarrow M\mathbf{I}$ — convergence in what sense?

  • If $\mathbf{X}$ is an $N\times M$ matrix of Gaussian variables, should $\mathbf{T\Lambda T}^\top$ not be random?

I guess you mean that the elements of $\mathbf{X}$ are numbers that were drawn independently from a standard Gaussian distribution.

If $Y$ is an $\mathbb R^N$-valued Gaussian random vector with covariance matrix $\mathbb E[YY^\top]=\mathbf I$ and $\mathbf A$ is another covariance matrix with eigendecomposition $$ \mathbf{Q\Gamma Q}^\top=\mathbf A\,, $$ then it is fairly easy to see that the transformed random vector $$ Z=\mathbf Q\sqrt{\mathbf\Gamma}\, Y $$ has covariance matrix $$ \mathbb E[ZZ^\top]=\mathbb E[\mathbf Q\sqrt{\mathbf\Gamma}\,YY^\top\sqrt{\mathbf\Gamma}\,\mathbf Q^\top]=\mathbf Q\sqrt{\mathbf\Gamma}\,\underbrace{\mathbb E[YY^\top]}_{\mathbf I}\,\sqrt{\mathbf\Gamma}\,\mathbf Q^\top=\mathbf A\,. $$

When your $M$ is large, the many columns of $\mathbf X$ are many realizations of the random vector $Y$. By the law of large numbers, averages taken over the columns of $\mathbf X$ should be close to expectations. For example, the covariance matrix $\mathbf I$ should satisfy $$ \mathbb E[YY^\top]\approx\frac{1}{M}\mathbf{XX}^\top\,. $$ Likewise, the covariance matrix $\mathbf A$ should satisfy $$ \mathbb E[ZZ^\top]\approx\frac{1}{M}\mathbf{Q}\sqrt{\mathbf\Gamma}\,\mathbf{XX}^\top\sqrt{\mathbf\Gamma}\,\mathbf Q^\top\,. $$
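The two approximations above can be sketched numerically (NumPy assumed; the sizes and the spectrum of $\mathbf A$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 6, 100_000  # large M for the law of large numbers

# Covariance matrix A = Q Gamma Q^T with a known eigendecomposition
gamma = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
A = Q @ np.diag(gamma) @ Q.T

# Columns of X are M independent draws of the standard Gaussian vector Y
X = rng.standard_normal((N, M))

# Z = Q sqrt(Gamma) Y, realized column by column
Z = Q @ np.diag(np.sqrt(gamma)) @ X

# Column averages vs. the exact covariances
err_I = np.abs(X @ X.T / M - np.eye(N)).max()  # E[Y Y^T] = I
err_A = np.abs(Z @ Z.T / M - A).max()          # E[Z Z^T] = A
print(err_I, err_A)
```

Both errors decay like $1/\sqrt M$, consistent with the law-of-large-numbers argument.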