I want to prove that these two definitions are equivalent:
- $\xi$ is a Gaussian random vector if $\varphi_\xi(x) = e^{i\langle a,\: x\rangle - \frac{1}{2} \langle\Sigma x,\: x\rangle }$, where $\Sigma$ is an $n \times n$ symmetric positive semi-definite matrix and $a \in \mathbb{R}^n$;
- $\xi = A\eta + b$, where $\eta = (\eta_1,\:\ldots,\: \eta_m)^T$ with $\eta_i \sim N(0,\: 1)$ independent, $A$ is an $n \times m$ matrix, and $b \in \mathbb{R}^n$.
When I prove that $1) \Rightarrow 2)$, I use the fact that $\Sigma = R^T D R$, where $R$ is orthogonal and $D$ is diagonal with non-negative entries on the diagonal. If $\sqrt{D} R$ is invertible (which I can prove easily when $\det D \neq 0$, by considering $y := (\sqrt{D} R)^{-1}x$), the argument goes through. But what should I do if $D$ has zeros on its main diagonal?
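A numerical sketch of the point at issue (my own illustration, with a hypothetical singular $\Sigma$): even when $D$ has a zero on the diagonal, so that $\sqrt{D}R$ has no inverse, the matrix $A = R^T\sqrt{D}$ still satisfies $AA^T = \Sigma$, which is all that definition 2) requires.

```python
import numpy as np

# Hypothetical rank-deficient covariance matrix (det = 0, but PSD):
Sigma = np.array([[2.0, 1.0, 1.0],
                  [1.0, 1.0, 0.0],
                  [1.0, 0.0, 1.0]])

# Spectral decomposition: Sigma = R^T D R with R orthogonal.
# np.linalg.eigh returns eigenvectors as columns, so R = eigvecs.T.
eigvals, eigvecs = np.linalg.eigh(Sigma)
D = np.diag(np.clip(eigvals, 0.0, None))  # clip tiny negative round-off
R = eigvecs.T

# A = R^T sqrt(D): no inverse of sqrt(D) R is ever needed.
A = R.T @ np.sqrt(D)

assert np.allclose(A @ A.T, Sigma)
print(np.linalg.matrix_rank(Sigma))  # 2 -- Sigma is singular, yet A A^T = Sigma
```

So one can avoid inverting anything: define $\xi := A\eta + a$ directly and verify its characteristic function, rather than changing variables via $(\sqrt{D}R)^{-1}$.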
The first thing I will use is that if $\varphi_\eta(t)$ is the characteristic function of $\eta$, and $\xi = A\eta + b$ is an affine transformation, then the characteristic function of $\xi$ is given by $$ \begin{align} \varphi_\xi(t) &=\exp(it^Tb)\varphi_\eta(A^Tt) \\ &= e^{it^Tb}e^{-\frac{1}{2}(A^Tt)^TI(A^Tt)} \\ &=e^{it^Tb-\frac{1}{2}t^TAA^Tt}. \end{align} $$ This is what you would get by calculating the first two moments directly: $$ \mathbb{E}\left[ \xi \right] = b, \qquad \mathbb{E}\left[ \left(\xi - \bar{\xi}\right)\left(\xi-\bar{\xi}\right)^T\right] = AA^T, $$ where $\bar{\xi} := \mathbb{E}[\xi]$. Note that $AA^T$ is symmetric and positive semi-definite, so it is a valid covariance matrix; matching with definition 1) gives $a = b$ and $\Sigma = AA^T$.
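The moment identities above can be sanity-checked by Monte Carlo (my own illustration, with an arbitrary $3 \times 2$ matrix $A$ and vector $b$, not part of the proof): sampling $\eta \sim N(0, I_2)$ and forming $\xi = A\eta + b$, the empirical mean should approach $b$ and the empirical covariance should approach $AA^T$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary choices for illustration:
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])
b = np.array([1.0, -1.0, 0.5])

# Rows of eta are i.i.d. samples from N(0, I_2); xi = A eta + b row-wise.
eta = rng.standard_normal((200_000, 2))
xi = eta @ A.T + b

# E[xi] = b and Cov(xi) = A A^T, up to Monte Carlo error:
assert np.allclose(xi.mean(axis=0), b, atol=0.05)
assert np.allclose(np.cov(xi.T), A @ A.T, atol=0.3)
```

The tolerances are loose because sample moments converge at rate $O(n^{-1/2})$; with $n = 200{,}000$ the checks pass comfortably.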