There is a standard result in probability theory: any Gaussian random vector $Y\sim\mathcal{N}(\mu, C)$ can be written as an affine transformation of a standard Gaussian $X$, namely $Y = AX+\mu$, where $A$ is a "square root" of the covariance matrix $C$.
To my knowledge this can be proven in two ways: through moment generating functions (or characteristic functions), or by bravely going through explicit calculations. This is obviously a matter of personal taste, but neither of these proofs satisfies me, so I was wondering whether any of you have a different, non-standard or original proof?
This can be derived easily from the fact that if $X\sim\mathcal N(\mu,C)$, then for any matrix $A$ of compatible shape we have $AX\sim\mathcal N(A\mu,\,ACA^T)$.
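As a quick sanity check of that transformation rule (a numerical sketch, with matrices chosen arbitrarily by us, not taken from the question):

```python
import numpy as np

rng = np.random.default_rng(0)

# If X ~ N(mu, C), then A X ~ N(A mu, A C A^T).
mu = np.array([1.0, -2.0, 0.5])
B = rng.standard_normal((3, 3))
C = B @ B.T                      # a positive semidefinite covariance
A = rng.standard_normal((2, 3))  # any matrix of compatible shape

samples = rng.multivariate_normal(mu, C, size=200_000)
Y = samples @ A.T                # apply A to each sample of X

# Empirical mean and covariance of A X approximate A mu and A C A^T.
assert np.allclose(Y.mean(axis=0), A @ mu, atol=0.1)
assert np.allclose(np.cov(Y.T), A @ C @ A.T, rtol=0.05, atol=0.1)
```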
Let $\sqrt C$ be the symmetric positive semidefinite square root of $C$, i.e. $\sqrt C\sqrt C=C$ with $\sqrt C=(\sqrt C)^T$. If $C$ is invertible, then by the above property $G=(\sqrt C)^{-1}(X-\mu)\sim\mathcal N\big(0,(\sqrt C)^{-1}C(\sqrt C)^{-1}\big)=\mathcal N(0,I)$ is a standard Gaussian such that $X=\sqrt C\,G+\mu$.
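The invertible case can be illustrated numerically (our own sketch: the symmetric square root is built from the eigendecomposition of $C$):

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a symmetric positive definite C and its symmetric square root,
# then check that G = sqrtC^{-1}(X - mu) is empirically standard Gaussian.
B = rng.standard_normal((3, 3))
C = B @ B.T + np.eye(3)          # symmetric positive definite
mu = np.array([2.0, 0.0, -1.0])

w, V = np.linalg.eigh(C)         # C = V diag(w) V^T
sqrtC = V @ np.diag(np.sqrt(w)) @ V.T
assert np.allclose(sqrtC @ sqrtC, C)

X = rng.multivariate_normal(mu, C, size=200_000)
G = (X - mu) @ np.linalg.inv(sqrtC).T   # G = sqrtC^{-1}(X - mu), row-wise

# G has mean ~ 0 and covariance ~ I, so X = sqrtC G + mu with G standard.
assert np.allclose(G.mean(axis=0), 0.0, atol=0.05)
assert np.allclose(np.cov(G.T), np.eye(3), atol=0.05)
```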
If $C$ is non-invertible the idea is the same, but we must be more careful since we cannot multiply by $(\sqrt C)^{-1}$. By the spectral theorem, write $C=PDP^T$ where $P$ is orthogonal and $D$ is diagonal with diagonal $(\lambda_1,\cdots,\lambda_p,0,\cdots,0)$ for some $\lambda_1,\cdots,\lambda_p>0$. Let $\sqrt D$ be the diagonal matrix with diagonal $(\sqrt{\lambda_1},\cdots,\sqrt{\lambda_p},0,\cdots,0)$.

By the property mentioned above, $P^T(X-\mu)\sim\mathcal N(0,P^TCP)=\mathcal N(0,D)$, so it is equal in distribution to a vector $(\sqrt{\lambda_1}Y_1,\cdots,\sqrt{\lambda_p}Y_p,0,\cdots,0)$ where $Y_1,\cdots,Y_p$ are independent standard Gaussian variables. Let $Y_{p+1},\cdots,Y_n$ (where $n$ is the size of $C$) be independent standard Gaussian variables, independent of $(Y_1,\cdots,Y_p)$, so that the vector $G=(Y_1,\cdots,Y_n)$ is a standard Gaussian and $(\sqrt{\lambda_1}Y_1,\cdots,\sqrt{\lambda_p}Y_p,0,\cdots,0)=\sqrt D\,G$. Then $X=P\sqrt D\,G+\mu$, and since $(P\sqrt D)(P\sqrt D)^T=PDP^T=C$, the matrix $P\sqrt D$ is indeed a square root of $C$.
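The degenerate case can be sketched the same way (a toy example of ours with a rank-deficient $C$; `eigh` supplies the orthogonal $P$):

```python
import numpy as np

rng = np.random.default_rng(2)

# C has rank 2 in dimension 3: diagonalize C = P D P^T with P orthogonal,
# form sqrtD, and check that X = P sqrtD G + mu has covariance C.
n = 3
M = rng.standard_normal((n, 2))
C = M @ M.T                      # PSD with rank 2: one zero eigenvalue
mu = np.zeros(n)

w, P = np.linalg.eigh(C)         # P orthogonal, eigenvalues w ascending
sqrtD = np.diag(np.sqrt(np.clip(w, 0.0, None)))  # clip tiny negatives

# P sqrtD is a square root of C in the sense A A^T = C.
assert np.allclose((P @ sqrtD) @ (P @ sqrtD).T, C)

G = rng.standard_normal((200_000, n))   # standard Gaussian samples
X = G @ (P @ sqrtD).T + mu              # X = P sqrtD G + mu, row-wise

# Empirical covariance of X approximates C, even though C is singular.
assert np.allclose(np.cov(X.T), C, atol=0.05)
```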