Eigenvalues of a mean-zero symmetric random matrix


Maybe this is too obvious, but I want to be sure... Let $Y$ be a $p\times p$ symmetric random matrix (i.e. you can think of $Y$ as a matrix with random entries). Define $E[Y]$, the expectation of $Y$, as the matrix with entries $(E[Y])_{ij} = E[Y_{ij}]$. I think the following statement is true:

If $E[Y] = 0_{p\times p}$ then $\lambda_{\max}(Y)\geq 0$ a.s., where $\lambda_{\max}(Y)$ is the greatest eigenvalue of $Y$ (which is real since $Y$ is symmetric).

My argument is as follows. Suppose that all the eigenvalues are negative. Then $tr(Y)<0$, which implies that $E[tr(Y)]<0$ and hence $tr(E[Y])<0$. This is a contradiction, since $E[Y] = 0_{p\times p}$. Therefore there exists at least one non-negative eigenvalue; in particular $\lambda_{\max}(Y)\geq 0$.

Is my argument correct? In that case, is there a generalization of this result?
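For what it's worth, the one step I am sure of, $E[tr(Y)] = tr(E[Y])$ (linearity of trace and expectation), can be checked numerically. A minimal sketch, assuming for concreteness that $Y$ is a symmetric matrix with Gaussian entries (the distribution is just an illustrative choice):

```python
import numpy as np

# Sanity check of the identity E[tr(Y)] = tr(E[Y]).
# Assumed example: Y is symmetrized from a matrix of i.i.d. standard normals.
rng = np.random.default_rng(0)
p, n = 3, 50_000

A = rng.standard_normal((n, p, p))
Y = (A + np.transpose(A, (0, 2, 1))) / 2   # symmetrize each sample

mean_of_traces = np.trace(Y, axis1=1, axis2=2).mean()  # estimate of E[tr(Y)]
trace_of_mean = np.trace(Y.mean(axis=0))               # estimate of tr(E[Y])

# The two agree up to floating-point rounding, and both are near 0.
print(mean_of_traces, trace_of_mean)
```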


There are 2 best solutions below


Consider $p=1$ and $Y$ equal to the $1\times 1$ matrix $(1)$ with probability $1/2$ or the $1\times 1$ matrix $(-1)$ with probability $1/2$. Then $E[Y] = 0$, but $\lambda_{\max}(Y) = -1 < 0$ with probability $1/2$, so the statement fails.
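A quick Monte Carlo sketch of this counterexample (illustrative only, not part of the original answer):

```python
import numpy as np

# Y is the 1x1 matrix (1) or (-1), each with probability 1/2, so E[Y] = 0,
# yet lambda_max(Y) = -1 on roughly half of the realizations.
rng = np.random.default_rng(0)
samples = rng.choice([-1.0, 1.0], size=100_000)  # lambda_max of each draw

print(samples.mean())          # close to 0: E[Y] = 0
print((samples < 0).mean())    # close to 1/2: lambda_max < 0 often
```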


To add to Ian's existing answer: the mistake in your proof is that $tr(Y) < 0$ for a particular realization does not imply that $\mathbb{E}[tr(Y)] < 0$.

The reason is that $Y$ is a single realization of the random matrix, which may be anything within the allowed range. Consider, for example, restricting $Y$ to be a $2 \times 2$ diagonal matrix with mean-zero entries $u,v$. Then $\mathbb{E}[tr(Y)] = \mathbb{E}[u+v] = 0$, but for any particular realization, $tr(Y) = u+v$, and both $u$ and $v$ may well turn out negative.
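A short simulation of this diagonal example (a sketch; taking $u$ and $v$ to be independent standard normals is my assumption, chosen only so that $E[Y]=0$ holds):

```python
import numpy as np

# Y = diag(u, v) with u, v independent standard normals, so E[Y] = 0.
# The average trace is near 0, yet individual realizations frequently
# have tr(Y) = u + v < 0.
rng = np.random.default_rng(1)
u = rng.standard_normal(100_000)
v = rng.standard_normal(100_000)
traces = u + v

print(traces.mean())          # near 0: E[tr(Y)] = 0
print((traces < 0).mean())    # near 1/2: tr(Y) < 0 in many realizations
```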

For another example, consider a uniform random variable $A \sim \mathcal{U}[-1,1]$. Clearly $\mathbb{E}[A] = 0$, but if I draw samples $A_1, A_2, \ldots$ from this distribution, a good number of them will be negative.