Show that $U^tWU \sim W$


A Wigner matrix is a symmetric random matrix with independent entries $W_{ij} = W_{ji} \sim \mathcal{N}(0,1)$ for $i < j$, and $W_{ii} \sim \mathcal{N}(0,2)$ on the diagonal.
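For concreteness, here is a quick numerical sketch of this definition (assuming Python with NumPy; the variable names and seed are my own choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Sample a Wigner matrix directly from the definition:
# W_ij = W_ji ~ N(0,1) for i < j, and W_ii ~ N(0,2).
W = np.zeros((n, n))
iu = np.triu_indices(n, k=1)          # strictly upper-triangular indices
W[iu] = rng.standard_normal(iu[0].size)
W = W + W.T                           # mirror to make it symmetric
W[np.diag_indices(n)] = np.sqrt(2) * rng.standard_normal(n)

assert np.allclose(W, W.T)            # symmetric by construction
```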

The following statement was given without proof: for any orthogonal matrix $U$, it holds that $U^tWU \sim W$, i.e. conjugating by $U$ preserves the distribution of $W$ as a Wigner matrix. What would be a good way to prove this?

I can see that $U^tWU$ is symmetric:

$$ (U^tWU)_{ij} = \underset{k}{\sum}\underset{l}{\sum} \text{u}_{ki}\text{w}_{kl}\text{u}_{lj} = \underset{k}{\sum}\underset{l}{\sum} \text{u}_{ki}\text{w}_{lk}\text{u}_{lj} = \underset{k}{\sum}\underset{l}{\sum} \text{u}_{kj}\text{w}_{kl}\text{u}_{li} = (U^tWU)_{ji} $$

using $w_{kl} = w_{lk}$ in the second equality and relabelling $k \leftrightarrow l$ in the third.

And that the expectation remains zero:

$$\mathbb{E}[(U^tWU)_{ij}] = \mathbb{E}\Big[\underset{k}{\sum}\underset{l}{\sum} \text{u}_{ki}\text{w}_{kl}\text{u}_{lj}\Big] = \underset{k}{\sum}\underset{l}{\sum} \text{u}_{ki}\text{u}_{lj}\mathbb{E}[\text{w}_{kl}] = 0$$

But how can I show that the variances of the entries are still the same?
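As a sanity check before any proof, the claim can be verified numerically. Below is a sketch in Python with NumPy (the sampling scheme, seed, and choice of $U$ are mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 3, 200_000

# A fixed orthogonal matrix U, from the QR factorization of a Gaussian matrix.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Sample many Wigner matrices: N(0,1) above the diagonal, N(0,2) on it.
A = rng.standard_normal((trials, n, n))
W = np.triu(A, k=1)
W = W + W.transpose(0, 2, 1)
W[:, np.arange(n), np.arange(n)] = np.sqrt(2) * rng.standard_normal((trials, n))

V = U.T @ W @ U  # conjugation, broadcast over the trials axis

# Empirical variances of the conjugated entries match the Wigner ones:
print(np.var(V[:, 0, 1]))  # close to 1 (off-diagonal)
print(np.var(V[:, 0, 0]))  # close to 2 (diagonal)
```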

Best answer

These random matrices are also called the Gaussian Orthogonal Ensemble (GOE). Perhaps the simplest way to see the property you mention is to use the following two facts.

  1. If $A$ is an $n \times n$ matrix with i.i.d. $\mathcal{N}(0,1)$ entries, then $W \sim \frac{1}{\sqrt{2}} (A + A^T)$.
  2. If $X \sim \mathcal{N}(\mu, \Sigma)$ is a Gaussian vector and $M$ a matrix, then $MX \sim \mathcal{N}(M \mu, M \Sigma M^T)$.
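Fact 1 can be checked directly: off the diagonal, $(a_{ij}+a_{ji})/\sqrt{2}$ has variance $(1+1)/2 = 1$, while on the diagonal $\sqrt{2}\,a_{ii}$ has variance $2$. A short numerical confirmation (a sketch, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
trials, n = 200_000, 3

# Entries of (A + A^T)/sqrt(2), with A i.i.d. N(0,1), have the Wigner
# variances: (1+1)/2 = 1 off the diagonal, (sqrt(2))^2 = 2 on it.
A = rng.standard_normal((trials, n, n))
S = (A + A.transpose(0, 2, 1)) / np.sqrt(2)

print(np.var(S[:, 0, 1]))  # close to 1
print(np.var(S[:, 0, 0]))  # close to 2
```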

Now, take $A$ as in 1. and let $U$ be an orthogonal matrix. Then the columns $(C_1, \dots, C_n)$ of $A$ are independent $\mathcal{N}(0, I)$, so the columns $(U C_1, \dots, U C_n)$ of $U A$ are independent with distribution $\mathcal{N}(U 0, U I U^T) = \mathcal{N}(0, I)$. In other words, $UA \sim A$. Applying the same argument to the rows of $UA$ (which are likewise independent $\mathcal{N}(0, I)$, since $UA \sim A$), right multiplication by $U^T$ gives $UAU^T \sim A$.
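The key step, $U C_k \sim \mathcal{N}(0, U I U^T) = \mathcal{N}(0, I)$, can also be seen numerically (a sketch; the random orthogonal $U$ and the seed are my own choices):

```python
import numpy as np

rng = np.random.default_rng(3)
n, trials = 3, 200_000

# A fixed orthogonal U from a QR factorization of a Gaussian matrix.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Samples of a column C ~ N(0, I), one per row.
C = rng.standard_normal((trials, n))
UC = C @ U.T  # row i is U @ C_i

# The empirical covariance of U C is close to U I U^T = I.
print(np.cov(UC, rowvar=False).round(2))
```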

We can then compute that $$ UWU^T \sim \frac{1}{\sqrt{2}} (UAU^T + UA^TU^T) = \frac{1}{\sqrt{2}} (B + B^T), $$ where $B = U A U^T \sim A$, and thus $U W U^T \sim \frac{1}{\sqrt{2}} (A + A^T) \sim W$.