Decoupling $n$ Gaussian random variables using linear algebra


For a bivariate normal distribution $\begin{pmatrix} X \\ Y \end{pmatrix}$ with mean $\begin{pmatrix} \mu_X \\ \mu_Y\end{pmatrix}$ and covariance matrix $\begin{pmatrix} \sigma_X^2 & \rho \sigma_X \sigma_Y \\ \rho \sigma_X \sigma_Y & \sigma_Y^2\end{pmatrix}$, we can turn these into a pair of uncorrelated (and hence, by joint normality, independent) random variables by rewriting (for example) $Y$ as a linear function of $X$ plus a remainder whose covariance with $X$ is $0$. So letting $Y = aX + Z$, we require $\text{cov}(X, Y - aX) = 0$, i.e. $\rho \sigma_X \sigma_Y - a\sigma_X^2 = 0 \Rightarrow a = \frac{\sigma_Y}{\sigma_X}\rho$, and then $Z = Y - \frac{\sigma_Y}{\sigma_X}\rho X$ and $X$ are uncorrelated. It's now easier to carry out calculations. To make things even nicer, we can also standardise these variables.
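As a quick numerical sanity check of this step, here is a small sketch with numpy: it draws samples from a bivariate normal with illustrative parameters ($\mu_X, \mu_Y, \sigma_X, \sigma_Y, \rho$ chosen arbitrarily, not from the question), forms $Z = Y - \frac{\sigma_Y}{\sigma_X}\rho X$, and confirms that the sample covariance of $X$ and $Z$ is near zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed for this sketch, not from the question)
mu_X, mu_Y = 1.0, -2.0
sigma_X, sigma_Y, rho = 2.0, 3.0, 0.6

mean = np.array([mu_X, mu_Y])
cov = np.array([[sigma_X**2,             rho * sigma_X * sigma_Y],
                [rho * sigma_X * sigma_Y, sigma_Y**2]])

samples = rng.multivariate_normal(mean, cov, size=200_000)
X, Y = samples[:, 0], samples[:, 1]

# Coefficient that zeroes cov(X, Y - a X): a = rho * sigma_Y / sigma_X
a = rho * sigma_Y / sigma_X
Z = Y - a * X

# Sample covariance of X and Z should be close to 0 (up to sampling noise)
print(np.cov(X, Z)[0, 1])
```

Replacing the hand-derived $a$ with, say, $\rho\sigma_X/\sigma_Y$ in the script makes the printed covariance visibly nonzero, which is a handy way to catch sign or ratio mistakes in the algebra.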

I want to now extend this reasoning to higher dimensions, for example three Gaussian random variables in a multivariate Gaussian distribution.

  1. How can I rephrase my above argument in terms of matrices and linear algebra, including the standardisation step?

  2. How can I use that rephrasing to extend this to a general multivariate Gaussian distribution?