Which matrices are covariance matrices?


Let $V$ be an $n \times n$ real matrix.

What conditions must $V$ satisfy so that there exists a random vector $X = (X_1, \dots, X_n)$ with $V = Var(X)$?

Of course necessary conditions are:

  • All diagonal entries must be nonnegative, since $v_{ii} = Var(X_i) \ge 0$
  • The matrix must be symmetric, since $Cov(X_i, X_j) = Cov(X_j, X_i)$
  • $|v_{ij}| \le \sqrt{v_{ii}v_{jj}}$, by the Cauchy–Schwarz inequality $|Cov(X_i, X_j)| \le \sqrt{Var(X_i) Var(X_j)}$

But I am sure these are not sufficient, as I have a counterexample.
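One such counterexample (a standard one, not necessarily the asker's) can be checked numerically: the matrix below is symmetric, has positive diagonal, and satisfies the Cauchy–Schwarz bound, yet it has a negative eigenvalue, so it cannot be a covariance matrix.

```python
import numpy as np

# Symmetric, positive diagonal, and |v_ij| <= sqrt(v_ii * v_jj) for all i, j...
V = np.array([[ 1.0, 1.0, -1.0],
              [ 1.0, 1.0,  1.0],
              [-1.0, 1.0,  1.0]])

assert np.allclose(V, V.T)                        # symmetric
assert np.all(np.diag(V) > 0)                     # positive diagonal
bound = np.sqrt(np.outer(np.diag(V), np.diag(V)))
assert np.all(np.abs(V) <= bound + 1e-12)         # Cauchy-Schwarz bound holds

# ...yet it is not positive semidefinite:
print(np.linalg.eigvalsh(V))  # eigenvalues are -1, 2, 2
```

The three pairwise correlations are each $\pm 1$ and individually attainable, but no joint distribution realizes all of them at once.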

So what other properties should we require of a matrix so that it can be considered a covariance matrix?

BEST ANSWER

I think I cleared this up sufficiently.

Okay, so

1) If $V$ is not positive semidefinite, then such a vector $X$ does not exist (since every covariance matrix is positive semidefinite).

2) If $V$ is symmetric positive semidefinite, then such an $X$ exists! [0]

This implies that

$$\text{there exists a random vector } X \text{ with } V = Cov(X) \iff V \text{ is symmetric positive semidefinite}$$

Since the conditions I listed in the question are necessary, we deduce that every symmetric positive semidefinite matrix has diagonal entries $\ge 0$ and satisfies $|v_{ij}| \le \sqrt{v_{ii}v_{jj}}$.

These conditions are still not sufficient for a matrix to be positive semidefinite, but sufficient conditions are well known (for instance, all eigenvalues being nonnegative, or all principal minors being nonnegative).


[0] Proof

Since $V$ is symmetric, by the spectral theorem there is an orthogonal matrix $Q$ such that $V = QDQ^T$, where $D$ is a diagonal matrix whose entries are the eigenvalues of $V$. If $V$ is positive semidefinite, the entries of $D$ are all $\ge 0$, hence we can find $X$ such that $D = Cov(X)$ (just take independent variables with the specified variances).

It follows that the random vector $QX$ has covariance $$Cov(QX) = Q\,Cov(X)\,Q^T = QDQ^T = V.$$
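The construction in the proof can be verified numerically (a sketch: `numpy.linalg.eigh` returns the eigenvalues $d$ and an orthogonal $Q$ with $V = Q\,\mathrm{diag}(d)\,Q^T$):

```python
import numpy as np

rng = np.random.default_rng(1)

# A symmetric positive semidefinite test matrix.
B = rng.standard_normal((4, 4))
V = B @ B.T

# Spectral decomposition V = Q D Q^T with Q orthogonal, D = diag(d).
d, Q = np.linalg.eigh(V)

assert np.all(d >= -1e-10)             # PSD: eigenvalues are nonnegative
assert np.allclose(Q @ Q.T, np.eye(4)) # Q is orthogonal

# X with independent components and Var(X_i) = d_i has Cov(X) = D,
# so Cov(QX) = Q D Q^T, which recovers V exactly:
assert np.allclose(Q @ np.diag(d) @ Q.T, V)
print("Cov(QX) = Q D Q^T = V verified")
```

This is exactly the algebraic identity $Cov(QX) = Q\,Cov(X)\,Q^T$ used in the last step, with $Cov(X) = D$ by construction.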