Independent representation of correlated $N(0,1)$ variables


Assume that $X_1$ and $X_2$ are correlated $N(0,1)$ variables. Then we can write \begin{align*} (X_1,X_2)^{T}=(\tilde{X}_1,\gamma \tilde{X}_1+\sqrt{1-\gamma^2}\tilde{X}_2)^{T} \end{align*} where $\tilde{X}_1$ and $\tilde{X}_2$ are independent $N(0,1)$ and $\gamma:=\operatorname{Cov}(X_1,X_2)$ (which equals the correlation here, since both variances are $1$). I assume the same is possible for a longer vector. Does anybody know the general representation in terms of independent variables for a vector with $n$ components? That would be really helpful.
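As a quick sanity check of this two-variable representation, the following sketch simulates $\tilde{X}_1,\tilde{X}_2$ and verifies empirically that $X_2$ has unit variance and that $\operatorname{Cov}(X_1,X_2)\approx\gamma$ (the value $\gamma=0.7$ and the sample size are arbitrary choices for illustration):

```python
import numpy as np

gamma = 0.7                                    # assumed correlation, |gamma| < 1
rng = np.random.default_rng(42)
Xt1, Xt2 = rng.standard_normal((2, 200_000))   # independent N(0,1) samples

X1 = Xt1
X2 = gamma * Xt1 + np.sqrt(1 - gamma**2) * Xt2

# Var(X2) = gamma^2 + (1 - gamma^2) = 1, and Cov(X1, X2) = gamma
print(np.var(X2), np.cov(X1, X2)[0, 1])
```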

Thanks a lot.

Accepted answer:

Assume without loss of generality that the vector $X$ is centered, and consider its positive semidefinite covariance matrix $C=E[XX^T]$. There exists a square matrix $L$ such that $LL^T=C$. Then $X$ has the same distribution as $LY$, where $Y$ is standard Gaussian, that is, the entries of $Y$ are i.i.d. standard normal random variables.
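In practice one usually takes $L$ to be the (lower-triangular) Cholesky factor of $C$. A minimal sketch with NumPy, using an arbitrary positive definite $3\times 3$ covariance matrix as the example target:

```python
import numpy as np

# Example target covariance matrix (positive definite, so the Cholesky
# factor exists and is unique with positive diagonal entries).
C = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.3],
              [0.2, 0.3, 1.0]])

L = np.linalg.cholesky(C)               # lower-triangular L with L @ L.T == C

rng = np.random.default_rng(0)
Y = rng.standard_normal((3, 100_000))   # i.i.d. N(0,1) entries
X = L @ Y                               # each column of X is a draw of the correlated vector

# The sample covariance of X should be close to C.
print(np.cov(X))
```

Each component of $X$ is then an explicit linear combination of the independent $Y_i$, exactly as in the two-dimensional formula from the question.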

Example: If $C=\begin{pmatrix}1 & \gamma \\ \gamma & 1\end{pmatrix}$, then a solution is $L=\begin{pmatrix}1 & 0 \\ \gamma & \delta\end{pmatrix}$ with $\delta=\sqrt{1-\gamma^2}$, which yields $$\begin{pmatrix}X_1 \\ X_2\end{pmatrix}=LY=\begin{pmatrix}Y_1 \\ \gamma Y_1+ \delta Y_2\end{pmatrix}.$$
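One can check numerically that this $L$ is precisely the Cholesky factor NumPy computes for $C$ (the value $\gamma=0.6$ is an arbitrary illustration, chosen so that $\delta=0.8$ exactly):

```python
import numpy as np

gamma = 0.6                      # assumed correlation, |gamma| < 1
delta = np.sqrt(1 - gamma**2)    # = 0.8 for gamma = 0.6

C = np.array([[1.0, gamma],
              [gamma, 1.0]])
L = np.array([[1.0, 0.0],
              [gamma, delta]])

print(np.allclose(L @ L.T, C))               # True: L L^T reproduces C
print(np.allclose(np.linalg.cholesky(C), L)) # True: L is the Cholesky factor
```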

See the Wikipedia page on the Cholesky decomposition and the related algorithms described there.