Orthogonal transformation of heteroskedastic matrices


Consider two $N \times N$ dimensional real matrices $A$ and $B$.

$A$ is a diagonal matrix whose diagonal entries are drawn i.i.d. from a real Gaussian distribution with mean $\mu = 0$ and variance $\sigma^2 = \alpha$, where $\alpha$ is some positive real number.

$B$ is a symmetric matrix whose off-diagonal entries are drawn i.i.d. (subject to the symmetry constraint $B_{ij} = B_{ji}$) from a real Gaussian distribution with mean $\mu = 0$ and variance $\sigma^2 = \beta$, where $\beta$ is again some positive real number. All diagonal elements of $B$ are $0$.

Finally consider the sum of these two matrices $H = A + B$.

The matrix $H$ is heteroskedastic, since its matrix elements have non-uniform variance: the diagonal entries have variance $\alpha$ while the off-diagonal entries have variance $\beta$.
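For concreteness, here is a small numerical sketch of this construction (the parameter values $\alpha = 2$, $\beta = 0.5$ and $N = 500$ are arbitrary choices for illustration), which also checks the stated elementwise variances empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
alpha, beta = 2.0, 0.5  # illustrative values of the two variances

# A: diagonal matrix with entries ~ N(0, alpha)
A = np.diag(rng.normal(0.0, np.sqrt(alpha), size=N))

# B: symmetric, zero diagonal, off-diagonal entries ~ N(0, beta)
T = np.triu(rng.normal(0.0, np.sqrt(beta), size=(N, N)), k=1)
B = T + T.T

H = A + B

# Empirical variances of the diagonal and off-diagonal elements of H
diag_var = np.var(np.diag(H))
off_var = np.var(H[~np.eye(N, dtype=bool)])
print(f"diagonal variance:     {diag_var:.3f}  (alpha = {alpha})")
print(f"off-diagonal variance: {off_var:.3f}  (beta  = {beta})")
```

The sample variances should come out close to $\alpha$ and $\beta$ respectively, up to the usual $O(1/\sqrt{N})$ sampling fluctuations.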

The question is to find the variance of the diagonal and off-diagonal elements of $H$ after some arbitrary orthogonal transformation:

$H \rightarrow \tilde{H} = C^{T}H C$

where $C$ is an orthogonal matrix.

For the purpose of this calculation, the following can be assumed, if needed:

  1. We are working in the large-$N$ limit.
  2. The matrix elements of $C$ can also be chosen randomly, as long as $C$ remains orthogonal.

I am hoping there is some general form for the variance of the diagonal and off-diagonal elements of $\tilde{H}$ in terms of $\alpha$ and $\beta$, but so far I have not been able to derive it.
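Since assumption 2 allows a random orthogonal $C$, one can at least probe the answer with a Monte Carlo experiment. The sketch below (my own setup, with the same illustrative parameters as above) draws $C$ from the Haar measure on the orthogonal group using the standard QR-with-sign-fix trick, and measures the empirical variances of the diagonal and off-diagonal entries of $\tilde{H} = C^T H C$:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500
alpha, beta = 2.0, 0.5  # illustrative values

# Rebuild H = A + B as defined in the question
A = np.diag(rng.normal(0.0, np.sqrt(alpha), size=N))
T = np.triu(rng.normal(0.0, np.sqrt(beta), size=(N, N)), k=1)
H = A + T + T.T

def haar_orthogonal(n, rng):
    """Haar-distributed orthogonal matrix via QR of a Gaussian matrix."""
    Z = rng.normal(size=(n, n))
    Q, R = np.linalg.qr(Z)
    # Multiplying by the signs of diag(R) makes the distribution Haar,
    # not merely orthogonal
    return Q * np.sign(np.diag(R))

C = haar_orthogonal(N, rng)
Ht = C.T @ H @ C  # the transformed matrix \tilde{H}

var_diag = np.var(np.diag(Ht))
var_off = np.var(Ht[~np.eye(N, dtype=bool)])
print(f"diagonal variance of H~:     {var_diag:.4f}")
print(f"off-diagonal variance of H~: {var_off:.4f}")
```

Note that any orthogonal $C$ preserves the trace and the Frobenius norm of $H$, so $\sum_{ij} \tilde{H}_{ij}^2 = \sum_{ij} H_{ij}^2$; a candidate formula for the two variances has to respect that sum rule.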

Any help is appreciated! Thanks!