Distribution of the eigenvector entries of a random matrix drawn from the GOE?


Heads up: I’m a physicist.

I’ve read in this physics paper the following about the eigenvectors of a random matrix $V$ drawn from the Gaussian Orthogonal Ensemble (GOE):

In relation to the eigenvectors, the main assertion is that the orthogonal matrix of eigenvectors $(\psi^1, \cdots, \psi^N)$ is distributed according to the Haar measure on the orthogonal group $O(N)$.

(I know nothing about Haar measures, and what I found on them was too advanced for my maths level.) The author then continues:

For our purposes, this means that the eigenvectors have independent and random Gaussian entries, up to normalization. Mathematically: $$[\psi_i^a]=0 \quad[\psi^a_i\psi^b_j]= \frac{1}{N}\delta_{ij}\delta^{ab},$$ where $[p]$ denotes the average of the random variable $p$ over the matrix ensemble.

where $\psi^a_i$ is the $i$-th entry of eigenvector $\psi^a$.

Could somebody explain to me how the author obtained the statistical moments above, and what one should do to obtain higher moments (like $[\psi^a_i\psi^b_j\psi^c_k\psi^d_l]$)?


As it stands, it looks to me as if $\psi^a_i$ were just some independent random variable and we were taking its expectation value with respect to a 1d Gaussian distribution (with what mean and deviation?). Then $[\psi_i^a]=0$ would follow because it is like computing $[x]$ for a centered 1d Gaussian. Similarly for $[\psi^a_i\psi^b_j]= \frac{1}{N}\delta_{ij}\delta^{ab}$: unless $\psi^a_i$ and $\psi^b_j$ are the same variable, we have the product of two independent “$[x]$” factors, which we already know vanishes.

But what about the $\frac{1}{N}$? I really need to understand where that $N$ comes from, since later on in the paper higher moments appear with different powers of $N$ dividing through.
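In case it helps answers: I ran a quick Monte Carlo sanity check of my own (this is just my sketch, not anything from the paper). I sample GOE matrices, pick one eigenvector, and estimate the first and second moments of its entries; both claims, including the $1/N$, do seem to come out numerically:

```python
import numpy as np

# My own sanity check: sample GOE matrices, take one eigenvector,
# and estimate [psi_i^a] and [psi_i^a psi_j^b] over the ensemble.
rng = np.random.default_rng(0)
N, trials = 50, 2000

mean_entries = np.zeros(N)   # estimates [psi_i] for each entry i (fixed a)
second_same = 0.0            # estimates [(psi_0)^2], claimed to be 1/N
second_cross = 0.0           # estimates [psi_0 psi_1], claimed to be 0

for _ in range(trials):
    A = rng.normal(size=(N, N))
    H = (A + A.T) / np.sqrt(2)      # GOE matrix (overall scale is irrelevant here:
                                    # rescaling H does not change its eigenvectors)
    _, vecs = np.linalg.eigh(H)
    psi = vecs[:, 0]                # eigenvector of the smallest eigenvalue
    psi *= rng.choice([-1.0, 1.0])  # eigenvectors are only defined up to a sign,
                                    # so randomize it to avoid any solver bias
    mean_entries += psi
    second_same += psi[0] ** 2
    second_cross += psi[0] * psi[1]

mean_entries /= trials
second_same /= trials
second_cross /= trials

print(np.max(np.abs(mean_entries)))  # close to 0
print(second_same, 1 / N)            # close to 1/N = 0.02
print(second_cross)                  # close to 0
```

So numerically the variance of a single entry really does look like $1/N$; what I’m after is the reasoning behind it.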