Characterization of covariance matrices


Let $X$ be a random vector. It is well known that the covariance matrix $\Sigma$ of $X$ is a square matrix that has the following properties:

(1) It is symmetric;

(2) Its diagonal entries are nonnegative;

(3) It is positive semidefinite.

Now, does the converse also hold? In other words, suppose we have a square matrix $M$ that satisfies properties (1) through (3). Can we say that there exists a random vector $X$ whose covariance matrix is equal to $M$?

More generally, what criteria do we have to show that a given matrix $M$ is actually a covariance matrix?


Best Answer

Yes, conditions (1)-(3) are sufficient. (In fact, (2) is redundant: if $M$ is positive semidefinite, then each diagonal entry satisfies $M_{ii}=e_i^T M e_i\geq 0$.)

Let $M$ be a matrix that meets conditions (1)-(3). Since $M$ is symmetric, it has an orthogonal diagonalization $M=P\Lambda P^{-1}=P\Lambda P^{T}$, where $P$ is orthogonal and $\Lambda=\operatorname{diag}(\lambda_1,\dots,\lambda_n)$ with $\lambda_i\geq 0$ by positive semidefiniteness. Note that for a random vector $Y$ and a non-random matrix $A$, the covariance of $AY$ is given by $$ \operatorname{cov}(AY)=A\operatorname{cov}(Y)A^T. $$ Now take a random vector $X$ with uncorrelated entries whose variances are $\lambda_1,\dots,\lambda_n$, so that $\operatorname{cov}(X)=\Lambda$. Then $$ \operatorname{cov}(PX)=P\operatorname{cov}(X)P^T=P\Lambda P^T=M, $$ so every symmetric positive semidefinite matrix $M$ is the covariance matrix of some random vector.
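The construction above can be checked numerically: diagonalize a candidate $M$, sample a vector $X$ with independent entries of variances $\lambda_i$, and verify that the sample covariance of $PX$ recovers $M$. A minimal sketch using NumPy, with a hypothetical example matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical symmetric positive-semidefinite matrix M.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# Orthogonal diagonalization M = P @ diag(lam) @ P.T (eigh is for symmetric matrices).
lam, P = np.linalg.eigh(M)
assert np.all(lam >= -1e-12), "M must be positive semidefinite"

# X: independent entries with variances lam, so cov(X) = diag(lam).
n_samples = 200_000
X = rng.standard_normal((n_samples, len(lam))) * np.sqrt(np.clip(lam, 0.0, None))

# Z = P X (row-wise: Z = X @ P.T) has covariance P diag(lam) P.T = M.
Z = X @ P.T
sample_cov = np.cov(Z, rowvar=False)
print(np.round(sample_cov, 2))
```

The sample covariance converges to $M$ as the number of samples grows; with 200,000 draws it should match to roughly two decimal places.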