Proving a theorem about covariance matrix


I wonder if the following proposition is true:

Let $X$ be a random vector in $\mathbb{R}^n$ with $E(\|X\|)<+\infty$.

If $\det(D_{X})=0$,

then there exist $\lambda_{1},\ldots,\lambda_{n} \in \mathbb{R}$, not all zero, such that $\displaystyle{\sum_{i=1}^{n}}\lambda_{i}(X_{i} - E(X_{i})) = 0$ almost surely,

where $D_{X}$ denotes the covariance matrix of $X$.

Thanks.


Best answer

Yes, it is true.

If you set $\bar X_i = X_i - E(X_i)$, then the covariance matrix is $$D_X = (\operatorname{cov}(X_i, X_j)) = (E(\bar X_i \bar X_j)).$$

Therefore $D_X$ is the Gram matrix associated to the elements $\bar X_1, \ldots, \bar X_n$ of the Hilbert space $L^2$, whose inner product is $\langle Y, Z \rangle = E(YZ)$. It is known that the Gram matrix of a family of vectors in a Hilbert space has zero determinant if and only if the vectors are linearly dependent. Therefore $$\det D_X = 0 \Leftrightarrow \text{ there exist scalars } \lambda_1, \ldots, \lambda_n, \text{ not all zero, such that }\\ \sum_i \lambda_i (X_i - E(X_i)) = 0 \text{ almost surely} \Leftrightarrow \\ \text{ there exist scalars } \lambda_1, \ldots, \lambda_n, \text{ not all zero, such that }\\ \sum_i \lambda_i X_i \text{ is almost surely constant.}$$

Therefore the covariance matrix of $X$ has zero determinant if and only if there exists a hyperplane that contains the values of $X$ with probability $1$.

Note that for the covariance matrix $D_X$ to exist you actually need the stronger condition $E(\|X\|^2)<\infty$, i.e. finite second moments.
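As a numerical illustration (a sketch using NumPy; the degenerate vector $X = (X_1, X_2, X_1 + X_2)$ is my own choice, not from the question), one can check that the sample covariance matrix is singular and that a null eigenvector supplies the coefficients $\lambda_i$:

```python
import numpy as np

# Hypothetical example: X_3 = X_1 + X_2, so X lies in a hyperplane of R^3.
rng = np.random.default_rng(0)
n = 100_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([x1, x2, x1 + x2])  # rows are samples of X

D = np.cov(X, rowvar=False)             # sample covariance matrix
print(np.linalg.det(D))                 # numerically ~0: D is singular

# An eigenvector for the smallest eigenvalue gives the lambda_i:
w, V = np.linalg.eigh(D)
lam = V[:, 0]
centered = X - X.mean(axis=0)           # the \bar X_i samples
print(np.abs(centered @ lam).max())     # numerically ~0: sum_i lambda_i * (X_i - E(X_i)) = 0
```

Here `lam` is (up to sign and normalization) the vector $(1, 1, -1)$, recovering the exact dependence $X_1 + X_2 - X_3 = 0$.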