I'm trying to prove the following statement:
Let $\textbf{X}=(X_1,\dots,X_n)$ be a random vector in $(\mathbb{R}^n, \mathcal{B}^n)$. Then $\textbf{X} \in Col(C)+\mathbb{E}[\textbf{X}]$ almost surely, where $C$ is the covariance matrix of $\textbf{X}$ and $\mathbb{E}[\textbf{X}]$ is its expected value.
Here's how I should proceed:
If $C$ is invertible the claim follows immediately, since then $Col(C)=\mathbb{R}^n$; otherwise, if $C$ is not invertible, $Ker(C)$ is non-trivial.
Then I should first prove the statement with $\mathbb{E}[\textbf{X}]=\bf{0}$ by showing that $\textbf{X}\in Col(C)=Col(C^T)=Row(C)=Ker(C)^{\perp}$, since $C$ is symmetric. I can't go on from here. I was thinking about using the spectral theorem and orthogonal eigenspaces, but I really don't know how to proceed.
Does anybody have any further hint to give me?
Thank you in advance!
So, I think I finally found a way to solve my problem.
Suppose $C$ is singular and $\mathbb{E}[\textbf{X}]=\bf{0}$. Then there exists $\textbf{v} \in \mathbb{R}^n$ such that $\bf{v}\neq\bf{0}$ and $C\bf{v}=\bf{0}$.
Then $\textbf{v}^T C \textbf{v}=0 \implies \sum_i v_i \sum_j v_j C_{ij} =0 \implies Cov\left(\sum_i v_i X_i,\sum_j v_j X_j\right)=0 \implies Var\left(\sum_i v_i X_i\right)=0 \implies \mathbb{E}[(\textbf{X}^T\textbf{v})^2]=0$.
The last implication holds because $Var(\textbf{X}^T\textbf{v})=\mathbb{E}[(\textbf{X}^T\textbf{v})^2]-\mathbb{E}[\textbf{X}^T\textbf{v}]^2$ and because $\mathbb{E}[\textbf{X}^T\textbf{v}]=\mathbb{E}[\textbf{v}^T\textbf{X}]=\textbf{v}^T\mathbb{E}[\textbf{X}]=0$ by hypothesis.
It then follows that $(\textbf{X}^T\textbf{v})^2 = 0$ almost surely (a nonnegative random variable with zero expectation vanishes almost surely), so $\textbf{X}^T\textbf{v}=0$ almost surely.
This shows that for every $\textbf{v} \in Ker(C)$, $\textbf{X}^T\textbf{v}=0$ almost surely. Picking a basis $\textbf{v}_1,\dots,\textbf{v}_k$ of $Ker(C)$ (finite, since $Ker(C)$ is finite dimensional), the intersection of the finitely many almost-sure events $\{\textbf{X}^T\textbf{v}_i=0\}$ still has probability $1$, hence $\textbf{X} \in Ker(C)^{\perp}=Col(C)$ almost surely.
If instead $\mathbb{E}[\textbf{X}]\neq\bf{0}$, one can take $\textbf{Y}=\textbf{X}-\mathbb{E}[\textbf{X}]$, which has the same covariance matrix as $\textbf{X}$ and expected value $\bf{0}$; by the previous case $\textbf{Y}\in Col(C)$ almost surely, and therefore $\textbf{X}\in Col(C)+\mathbb{E}[\textbf{X}]$ almost surely.
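As a quick numerical sanity check (my own sketch, not part of the proof, and assuming NumPy is available), here is a degenerate example: $\textbf{X}=(Z,Z)$ with $Z$ standard normal has the singular covariance matrix $C=\begin{pmatrix}1&1\\1&1\end{pmatrix}$, whose kernel is spanned by $\textbf{v}=(1,-1)$, and indeed every sample of $\textbf{X}$ is orthogonal to $\textbf{v}$:

```python
import numpy as np

# Degenerate example: X = (Z, Z) with Z standard normal.
# Its true covariance matrix is [[1, 1], [1, 1]], which is singular,
# and Ker(C) is spanned by v = (1, -1).
rng = np.random.default_rng(0)
Z = rng.standard_normal(10_000)
X = np.stack([Z, Z], axis=1)      # 10000 samples of the random vector X, E[X] = 0

C_hat = np.cov(X, rowvar=False)   # sample covariance, close to [[1, 1], [1, 1]]
v = np.array([1.0, -1.0])         # spans Ker(C) for the true covariance

# X^T v = Z - Z should vanish for every sample, i.e. X lies in
# Ker(C)^perp = Col(C) almost surely:
print(np.max(np.abs(X @ v)))      # 0.0
```

Of course this only checks one example; the argument above is what proves the general statement.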