Bounding the eigenvalues of a matrix projected onto a random subspace

Let $\Sigma$ be a fixed $m\times m$ positive semidefinite matrix, and $X$ be an $m\times n$ matrix (with $n<m$) whose columns are random vectors drawn uniformly from the unit sphere. I need tail bounds on the trace and operator norms of $XX^+\Sigma$ (where $XX^+$ is the orthogonal projection onto the column space of $X$). My guess is that these quantities look roughly like $\text{tr}(XX^+\Sigma) \approx \frac{n}{m}\text{tr}\Sigma$.

So far, I've looked at a few texts on random matrices, but I'm having trouble converting the matrix $XX^+\Sigma$ into one of the standard forms these books discuss.


The columns of $X$ lie in $\mathbb{R}^m$ and $\Sigma$ is symmetric positive semidefinite on $\mathbb{R}^m$. Up to an orthonormal change of basis of $\mathbb{R}^m$ (which does not change the distribution of $X$, since the uniform distribution on the sphere is rotation invariant), we may assume that $\Sigma=\operatorname{diag}(\lambda_1,\dots,\lambda_m)$ with $\lambda_i\geq 0$. Moreover, $\operatorname{tr}(XX^+\Sigma)$ is linear in $\Sigma$, so it suffices to study the case $\Sigma=\operatorname{diag}(1,0,\dots,0)$, for which $\operatorname{tr}(XX^+\Sigma)=(XX^+)_{1,1}$.

Since $X$ has full column rank almost surely, $\operatorname{tr}(XX^+)=\operatorname{tr}(X^+X)=\operatorname{tr}(I_n)=n$. By symmetry, the diagonal entries of $XX^+$ are identically distributed, so $E((XX^+)_{1,1})=\dfrac{1}{m}E(\operatorname{tr}(XX^+))=\dfrac{n}{m}$.
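A quick Monte Carlo sanity check of this expectation (a sketch using NumPy; the dimensions and sample size are arbitrary choices, and columns uniform on the sphere are generated as normalized Gaussians):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, trials = 8, 3, 20_000            # arbitrary dimensions and sample size

acc = 0.0
for _ in range(trials):
    # normalized Gaussian columns are uniform on the unit sphere
    X = rng.standard_normal((m, n))
    X /= np.linalg.norm(X, axis=0)
    P = X @ np.linalg.pinv(X)          # orthogonal projection XX^+
    acc += P[0, 0]

print(acc / trials, n / m)             # the two values should be close
```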

Conclusion. By linearity, $E(tr(XX^+\Sigma))=\dfrac{n}{m}tr(\Sigma)$.
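The conclusion can likewise be checked numerically (again a NumPy sketch; here the fixed positive semidefinite $\Sigma$ is itself generated at random, an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, trials = 8, 3, 20_000            # arbitrary dimensions and sample size

A = rng.standard_normal((m, m))
Sigma = A @ A.T                        # an arbitrary fixed PSD matrix

acc = 0.0
for _ in range(trials):
    X = rng.standard_normal((m, n))
    X /= np.linalg.norm(X, axis=0)     # columns uniform on the unit sphere
    acc += np.trace(X @ np.linalg.pinv(X) @ Sigma)

print(acc / trials, n / m * np.trace(Sigma))   # should be close
```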

In fact, the hypothesis about the columns of $X$ is not needed. Assume only that the entries $(x_{i,j})$ are chosen independently and uniformly from the same interval $[-a,a]$. The rows of $X$ are then exchangeable, which implies that the diagonal entries $(XX^+)_{i,i}$ all follow the same law. More precisely, $E(XX^+)=\dfrac{n}{m}I_m$.
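This stronger statement can also be checked by simulation (a NumPy sketch; the dimensions, sample size, and interval endpoint $a$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
m, n, trials, a = 6, 2, 20_000, 3.0    # arbitrary dimensions, sample size, interval

acc = np.zeros((m, m))
for _ in range(trials):
    X = rng.uniform(-a, a, size=(m, n))   # i.i.d. entries uniform in [-a, a]
    acc += X @ np.linalg.pinv(X)

print(acc / trials)                    # approximately (n/m) * I_m
```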