Let $X=(X_1,\ldots,X_n)^T$ be a random vector with mean $m$ and covariance $C$. Assume $C$ is positive definite, so it has an inverse $C^{-1}$. Let $C=A A^T$ for some invertible matrix $A$ (e.g. the Cholesky factor of $C$).
I have seen two main approaches to defining a Gaussian random vector. One takes $X=AY+m$ as the definition, where $Y=(Y_1,\ldots,Y_n)^T$ is a random vector whose components $Y_i$ are independent standard Gaussian random variables; then $X$ has mean $m$ and covariance $A A^T\equiv C$. The other says that a random vector with density $$\frac{1}{(2\pi)^\frac{n}{2}\sqrt{\det C}}\exp\left(-\frac{1}{2}(x-m)^T C^{-1}(x-m)\right)$$ is a Gaussian vector with mean $m$ and covariance $C$.
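As a sanity check, the first definition can be tried numerically. A minimal sketch with NumPy, using a hypothetical $2\times 2$ covariance and the Cholesky factor as one choice of $A$ (the sample mean and covariance of $X=AY+m$ should be close to $m$ and $C$):

```python
import numpy as np

# Hypothetical example: a mean m and a positive definite covariance C.
m = np.array([1.0, -2.0])
C = np.array([[2.0, 0.6],
              [0.6, 1.0]])

# One choice of A with A A^T = C: the (lower-triangular) Cholesky factor.
A = np.linalg.cholesky(C)

# Sample Y with independent standard Gaussian components and set X = A Y + m.
rng = np.random.default_rng(0)
Y = rng.standard_normal((2, 100_000))   # columns are independent samples of Y
X = A @ Y + m[:, None]

# Sample mean and sample covariance of X should approximate m and C.
print(X.mean(axis=1))
print(np.cov(X))
```

Any other factor $A$ with $A A^T = C$ (e.g. the symmetric square root) would work equally well here.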
My question is: how can we deduce the density function from the first definition? My attempt is to start from $$F_X(x)=\Pr(X\le x)=\Pr(AY+m\le x)=\int_{y:\,Ay+m\le x}f_Y(y)\,\mathrm{d}y,$$ where the inequalities are all component-wise. Using the independence of the $Y_i$, we can easily derive $f_Y(y)=\frac{1}{(2\pi)^\frac{n}{2}}e^{-\frac{1}{2}y^Ty}$. I'm stuck at the last step, where I need to differentiate the integral with respect to $x$. Could someone show me how to proceed? Thank you very much.
The mapping $T: y \mapsto Ay + m$ has inverse mapping $T^{-1}: x \mapsto A^{-1}(x - m)$, whose Jacobian determinant is $$ J_{T^{-1}} = \det(A^{-1}) = \dfrac{1}{\det(A)}. $$ Since $C = A A^T$, we have $\det(C) = \det(A)\det(A^T) = \det(A)^2$, so $\vert \det(A) \vert = \sqrt{\det(C)}$ and therefore $$ \vert J_{T^{-1}} \vert = \dfrac{1}{\sqrt{\det(C)}}. $$ By the multidimensional change-of-variables theorem, $$ f_X(x) = f_Y(T^{-1}(x)) \,\vert J_{T^{-1}} \vert = \dfrac{1}{\sqrt{\det(C)}}\, f_Y(A^{-1}(x - m)). $$
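The determinant identities used above are easy to check numerically. A small sketch with NumPy, on a hypothetical randomly generated positive definite $C$:

```python
import numpy as np

# Hypothetical 3x3 positive definite C; A is its Cholesky factor, so C = A A^T.
rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
C = B @ B.T + 3 * np.eye(3)          # positive definite by construction
A = np.linalg.cholesky(C)

# det(C) = det(A) det(A^T) = det(A)^2, hence |det A| = sqrt(det C),
# and the Jacobian determinant of the inverse map is 1 / det(A).
print(abs(np.linalg.det(A)), np.sqrt(np.linalg.det(C)))
print(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))
```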
The calculation of $f_Y$ is straightforward, since the $Y_i$ are independent standard normal random variables: $$ f_Y(y) = \dfrac{1}{(2\pi)^{n/2}}\exp\left(-\dfrac{1}{2}y^T y\right) $$
Thus, $$ f_X(x) = \dfrac{1}{(2\pi)^{n/2}\sqrt{\text{det}(C)}}\exp\left(-\dfrac{1}{2}(x - m)^T (A^{-1})^T A^{-1}(x - m)\right) $$
Finally, notice that $(A^{-1})^T A^{-1} = (A^T)^{-1} A^{-1} = (A A^T)^{-1} = C^{-1}$, which yields exactly the density in the second definition.
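To see the whole derivation in one place, here is a sketch comparing the change-of-variables density $f_Y(A^{-1}(x-m))/\sqrt{\det C}$ with the closed-form density involving $C^{-1}$, on a hypothetical small example; the two should agree at any point $x$:

```python
import numpy as np

# Hypothetical example: mean m, positive definite C, and A with A A^T = C.
m = np.array([1.0, -2.0])
C = np.array([[2.0, 0.6],
              [0.6, 1.0]])
A = np.linalg.cholesky(C)
n = len(m)

def f_Y(y):
    # Density of n independent standard Gaussians.
    return np.exp(-0.5 * y @ y) / (2 * np.pi) ** (n / 2)

def f_X_change_of_vars(x):
    # f_X(x) = f_Y(A^{-1}(x - m)) / sqrt(det C), from the change of variables.
    return f_Y(np.linalg.solve(A, x - m)) / np.sqrt(np.linalg.det(C))

def f_X_closed_form(x):
    # The multivariate normal density with C^{-1} in the exponent.
    q = (x - m) @ np.linalg.inv(C) @ (x - m)
    return np.exp(-0.5 * q) / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C)))

x = np.array([0.5, -1.0])
print(f_X_change_of_vars(x), f_X_closed_form(x))
```

The agreement is precisely the identity $(A^{-1})^T A^{-1} = C^{-1}$ at work inside the exponent.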