I'm having a hard time understanding the proof of Theorem $14.3$ in the lecture notes of Gordan Zitkovic about multivariate Gaussian vectors:
https://web.ma.utexas.edu/users/gordanz/notes/brownian_motion.pdf
In particular the implication $1\to 3$ :
Suppose we have a Gaussian vector $X=(X_1,\dots,X_n)^T$ whose characteristic function is $$\phi_X(t)=e^{it^T\mu - \frac{t^T\Sigma t}{2}},$$ where $\mu \in \mathbb{R}^n$ and $\Sigma \in M_{n\times n}(\mathbb R)$ is a symmetric nonnegative definite matrix with $\operatorname{rank}(\Sigma) = d \in \{1,\dots,n\}$.
We wish to prove that there exists a matrix $A\in M_{n\times d}(\mathbb {R})$ such that:
1) $\operatorname{rank}(A) = d$;
2) there exists a random vector $Y = (Y_1,\dots,Y_d)^T$, defined on the same probability space as $X$, which consists of independent unit normals such that $$X = AY + \mu.$$
Here is the proof (I have adapted some parts):
By replacing $X$ by $X - \mu$, where $\mu = E[X]$, we can assume without loss of generality that $E[X] = 0$. By nonnegative definiteness and symmetry we have $$Q^T\Sigma Q = \operatorname{diag}(\lambda_1,\dots,\lambda_d,0,\dots,0),$$ where $\lambda_1,\dots,\lambda_d > 0$ and $Q$ is the matrix whose columns form an orthonormal basis $\{e_1,\dots,e_n\}$ consisting of eigenvectors of $\Sigma$. Let $T$ be the matrix whose columns are the first $d$ eigenvectors $\{e_1,\dots,e_d\}$. We have that:
$$1)\ T^T \Sigma T = \operatorname{diag}(\lambda_1,\dots, \lambda_d)$$
$$2)\ T^T T = I_d \in M_{d\times d}(\mathbb R)$$ (the identity matrix with respect to the canonical basis)
$$3)\ T T^T \in M_{n\times n}(\mathbb R), \quad\text{with } T T^Te_{i}=e_i \text{ for all } 1\le i \le d \text{ and } T T^Te_{i}=(0,\dots,0)^T \text{ for all } d+1\le i \le n.$$
Let $Y = \operatorname{diag}(\lambda_1^{-1/2},\dots,\lambda_d^{-1/2})\, T^T X$. If we compute the characteristic function of $Y$, we get $$ \phi_Y(t)=\phi_X\!\left(T \operatorname{diag}(\lambda_1^{-1/2},\dots,\lambda_d^{-1/2})\,t\right) = e^{-\frac{t^2_1+\dots+t^2_d}{2}},$$ which is the characteristic function of $d$ independent unit normals. So $Y$ consists of $d$ independent unit normals. Now let $A=T \operatorname{diag}(\lambda_1^{1/2},\dots,\lambda_d^{1/2})$. Observe that $\operatorname{rank}(A)=d$ and $AY=TT^TX$.
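This construction is easy to check numerically. The following is a small sanity-check sketch (not part of the proof), using NumPy and a hypothetical rank-$2$ covariance matrix in $\mathbb R^3$, verifying the three properties of $T$ and that $A = T\operatorname{diag}(\lambda_i^{1/2})$ satisfies $AA^T = \Sigma$ with $\operatorname{rank}(A)=d$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a singular covariance matrix of rank d = 2 in R^3.
B = rng.standard_normal((3, 2))
Sigma = B @ B.T                      # symmetric, nonnegative definite, rank 2
d = np.linalg.matrix_rank(Sigma)

# Eigendecomposition: Sigma = Q diag(lam) Q^T with orthonormal columns of Q.
lam, Q = np.linalg.eigh(Sigma)       # eigh returns eigenvalues in ascending order
lam, Q = lam[::-1], Q[:, ::-1]       # reorder so the d positive eigenvalues come first
T = Q[:, :d]                         # first d eigenvectors
L = np.diag(lam[:d])

# The three properties used in the proof:
assert np.allclose(T.T @ Sigma @ T, L)       # 1) T^T Sigma T = diag(lam_1,...,lam_d)
assert np.allclose(T.T @ T, np.eye(d))       # 2) T^T T = I_d
P = T @ T.T                                  # 3) orthogonal projection onto range(Sigma)

# A = T diag(sqrt(lam_i)) satisfies A A^T = Sigma and rank(A) = d.
A = T @ np.diag(np.sqrt(lam[:d]))
assert np.allclose(A @ A.T, Sigma)
assert np.linalg.matrix_rank(A) == d
```

(The identity $AA^T = T\operatorname{diag}(\lambda_1,\dots,\lambda_d)T^T = \Sigma$ is exactly why $AY$ has the right covariance.)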
It remains to show that $TT^T X=X$ almost surely:
On one hand, we clearly have $\| TT^Tx\| \le \|x\|$ for all $x \in \mathbb{R}^n$. On the other hand, the random vectors $TT^TX$ and $X$ have the same characteristic function, and are therefore identically distributed. Consequently, $TT^TX=X$ almost surely.
I am having trouble with the last part: I understand that $TT^TX$ and $X$ have the same characteristic function. I just don't see why:
1) $\| TT^Tx\| \le \|x\|$ for all $x$, and
2) this implies that $TT^TX=X$ almost surely.
I've been stuck on this problem, and I just don't see how to prove it.
Any ideas or suggestions would be highly appreciated.
$TT^{T}$ is the orthogonal projection onto $\operatorname{span}\{e_1,\dots,e_d\}$, because the $\{e_{i}\}$ form an orthonormal basis. Hence the first claim.
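Spelling out the norm inequality: since $T^TT = I_d$, the matrix $P = TT^T$ satisfies $P^2 = P = P^T$, and by Parseval's identity in the orthonormal basis $\{e_1,\dots,e_n\}$,

```latex
\[
\|TT^T x\|^2
= x^T T (T^T T) T^T x
= x^T TT^T x
= \sum_{i=1}^{d} \langle x, e_i \rangle^2
\;\le\; \sum_{i=1}^{n} \langle x, e_i \rangle^2
= \|x\|^2 .
\]
```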
The event $\{TT^{T}X=X\}$ is the same as the event $\{X\in \operatorname{im}(TT^{T})\}$, since the fixed points of the projection are exactly its image. Because $TT^{T}X$ and $X$ are identically distributed, this event has the same probability as the event $\{TT^{T}X\in \operatorname{im}(TT^{T})\}$, which is $1$.