Gaussian matrix integration


Consider a random Hermitian matrix $B$ of size $N\times N$ with Gaussian probability measure given by

$$ dx(B) = e^{-\frac{N}{2}\operatorname{Tr}(B^2)}\prod_{i=1}^N dB_{ii} \prod_{i<j} d\Re(B_{ij})\,d\Im(B_{ij}) $$ where $B_{ii}, \Re(B_{ij}), \Im(B_{ij})$ are independent Gaussian random variables.

How can we prove the following integral? I am looking for a detailed proof that reduces it to the usual one-dimensional Gaussian integrals we know how to do. Thanks.

$$ \int dx(B) = 2^{\frac{N}{2}} \left(\frac{\pi }{N}\right)^{\frac{N^2}{2}} $$



Best answer:

We have $\operatorname{Tr}(B^2)=\sum_i\sum_j B_{ij}B_{ji}=\sum_i\sum_j \mid B_{ij}\mid^2=\sum_i B_{ii}^2+2\sum_{i<j}\left(\Re(B_{ij})^2+\Im(B_{ij})^2\right)$. Thus, by independence of the components, the integrand factors as $\prod_i e^{-Nx_{ii}^2/2}dx_{ii}\prod_{i<j}e^{-Nr_{ij}^2}dr_{ij}\, e^{-NI_{ij}^2}dI_{ij}$ (I used different names for the dummy variables just for clarity). Each factor in the first product integrates to $\int e^{-Nx^2/2}dx=(2\pi/N)^{1/2}$, while each off-diagonal pair contributes $\left((\pi/N)^{1/2}\right)^2=\pi/N$. There are $N$ factors in the first product and $N(N-1)/2$ pairs in the second, so we get $(2\pi/N)^{N/2}(\pi/N)^{N(N-1)/2}=2^{N/2}(\pi/N)^{N^2/2}$.
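As a quick sanity check of this factorized computation (my own sketch, not part of the original derivation, using numpy quadrature), one can evaluate each one-dimensional Gaussian factor numerically and compare the product against the closed form:

```python
import numpy as np

# The matrix integral splits into N diagonal factors  int exp(-N x^2/2) dx
# and N(N-1)/2 off-diagonal pairs, each pair contributing
# int exp(-N r^2) dr * int exp(-N I^2) dI.  Evaluate each by a Riemann sum.
N = 4
x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]

diag = np.sum(np.exp(-N * x**2 / 2)) * dx   # ~ sqrt(2*pi/N)
offd = np.sum(np.exp(-N * x**2)) * dx       # ~ sqrt(pi/N)

numeric = diag**N * offd**(N * (N - 1))     # N(N-1)/2 pairs, two factors each
closed = 2**(N / 2) * (np.pi / N)**(N**2 / 2)
print(numeric, closed)                      # the two values agree
```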

Edit: I also think it is incorrect to say that $dB_{ii}$ is a Gaussian measure. If you assume the entries are jointly distributed according to $dx$ (normalized to $1$), then it is true that they are independent Gaussians; but in the equation defining $dx$, the measure $dB_{ii}$ is just Lebesgue measure on the line (try replacing it with a Gaussian measure and you will get a different answer).

Another answer:

Your question involves a fairly standard suite of arguments from the random matrix literature, so I will provide references you can use. I don't know of a short, elementary proof of the result that avoids the steps I describe below.

The first step is to reduce the degrees of freedom: any Hermitian matrix can be diagonalized as $B=UDU^*$, with $U$ a unitary matrix and $D$ the diagonal matrix of eigenvalues; in the Hermitian ensemble, $U$ and $D$ are independent. The change of variables $B\rightarrow(U,D)$ can be worked out in detail and reads

$$\prod_{i\le j} d B_{ij} = VDM(\lambda_1,\ldots,\lambda_N)^2 \prod_{i=1}^N d\lambda_i \, d\Omega $$

where $VDM$ is the Vandermonde determinant of the eigenvalues $\lambda_i$ and $d\Omega$ is the Haar measure on the unitary group. The proof is not straightforward; details are provided here:

Lectures on Random Matrices
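As a quick illustration of the decomposition used in this step (my own sketch, not taken from the reference, with numpy as an assumed tool), one can draw a random Hermitian matrix and verify $B=UDU^*$ numerically:

```python
import numpy as np

# Any Hermitian matrix B diagonalizes as B = U D U* with U unitary
# and D real diagonal; numpy.linalg.eigh computes this decomposition.
rng = np.random.default_rng(0)
N = 5
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
B = (A + A.conj().T) / 2              # symmetrize to make B Hermitian
evals, U = np.linalg.eigh(B)          # real eigenvalues, unitary U
reconstructed = U @ np.diag(evals) @ U.conj().T
print(np.allclose(B, reconstructed))  # True
```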

Now the particular integral with Gaussian measure defines what is called the Gaussian Unitary Ensemble. Because the integrand is invariant under unitary conjugation, the $U$ integral factors out as the Haar volume, and the integral to be evaluated becomes

$$\int \prod_{i=1}^N d\lambda_i \exp\left(-\frac{N}{2}\sum_i\lambda_i^2\right)\prod_{i<j}(\lambda_i-\lambda_j)^2$$
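For small $N$ this eigenvalue integral can be checked numerically (my addition, not from the references): its known closed form, a Gaussian case of the Selberg/Mehta integral, is $N^{-N^2/2}(2\pi)^{N/2}\prod_{j=1}^N j!$. A sketch for $N=2$:

```python
import numpy as np

# Two-dimensional Riemann sum of  exp(-N/2 (l1^2+l2^2)) (l1-l2)^2
# for N = 2, compared against the Mehta-type closed form
#   N^{-N^2/2} * (2*pi)^{N/2} * 1! * 2!   (= pi for N = 2).
N = 2
g = np.linspace(-8.0, 8.0, 2001)
d = g[1] - g[0]
l1, l2 = np.meshgrid(g, g)
f = np.exp(-N * (l1**2 + l2**2) / 2) * (l1 - l2)**2
numeric = f.sum() * d * d
closed = N**(-N**2 / 2) * (2 * np.pi)**(N / 2) * 1 * 2  # 1! * 2!
print(numeric, closed)   # both ~ 3.1416
```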

The next step of the computation is to expand the Vandermonde determinants in the basis of orthogonal polynomials with respect to the Gaussian weight, i.e. Hermite polynomials in this case.
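In outline (a standard identity, stated here without proof): writing the Vandermonde as a determinant of monic orthogonal polynomials $p_k$ for the weight $e^{-N\lambda^2/2}$ (rescaled Hermite polynomials),

$$VDM(\lambda_1,\ldots,\lambda_N)=\det\left[\lambda_j^{\,i-1}\right]_{i,j}=\det\left[p_{i-1}(\lambda_j)\right]_{i,j},$$

since row operations with monic polynomials leave the determinant unchanged. Expanding both determinants and using the orthogonality $\int p_k(\lambda)p_l(\lambda)\,e^{-N\lambda^2/2}\,d\lambda=h_k\delta_{kl}$ kills all cross terms and leaves

$$\int \prod_{i=1}^N d\lambda_i\, e^{-\frac{N}{2}\sum_i\lambda_i^2}\,VDM^2 = N!\prod_{k=0}^{N-1}h_k.$$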

This is worked out in detail in the following two papers, where the expected value of the product of two Vandermonde determinants is explained:

Orthogonal polynomial ensembles in probability theory

Dimers and orthogonal polynomials: connections with random matrices