What makes a legal variance matrix?

Straightforward question: in probability theory, what makes a matrix a 'variance' matrix?

The ones we have worked with all 'happen' to be symmetric and positive-definite. Are there any more requirements? Where do these requirements come from? There isn't any theorem in my book that explicitly says "M is a variance matrix if and only if it satisfies blah blah blah".

You can define a (unique) multivariate normal distribution with any given symmetric positive-semidefinite matrix as its covariance matrix. So all symmetric positive-semidefinite matrices are legal covariance matrices.

(Note that it need only be positive-semidefinite and not positive-definite because you can have the distribution being a point mass in some directions.)
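To illustrate the point-mass remark, here is a small numpy sketch (the rank-1 matrix below is an assumed example) sampling from a multivariate normal whose covariance matrix is positive-semidefinite but singular; every sample collapses onto a single line:

```python
import numpy as np

# A symmetric positive-semidefinite but SINGULAR matrix: rank 1,
# so the distribution is a point mass in one direction.
cov = np.array([[1.0, 1.0],
                [1.0, 1.0]])

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)

# Every sample lies on the line x1 == x2 (up to floating-point error),
# confirming the distribution is degenerate along the other direction.
print(np.allclose(samples[:, 0], samples[:, 1]))
```

A positive-definite covariance would be required only if we insisted on a density; the distribution itself exists for any positive-semidefinite matrix.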

Let $\mathbf{X}=(X_1,\dots,X_n)$ be the vector of random variables whose covariance matrix we are computing. Covariance is defined by

$$\mathop{Cov}(X,Y)=E\left[\left(X-E\left[X\right]\right)\left(Y-E\left[Y\right]\right)\right]$$

and so $\mathop{Cov}(X_i,X_j)=\mathop{Cov}(X_j,X_i)$. So the covariance matrix is symmetric.
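Symmetry is easy to see numerically as well. A minimal sketch, using numpy's sample covariance (`np.cov`, with rows as variables) on assumed random data:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 1000))  # three random variables, 1000 observations

Sigma = np.cov(X)  # 3x3 sample covariance matrix

# Cov(X_i, X_j) == Cov(X_j, X_i), so the matrix equals its own transpose.
print(np.allclose(Sigma, Sigma.T))
```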

To show it is positive semidefinite, let $\mathbf{x}=(x_1,\dots,x_n)$ be some (non-random) vector. Then note

$$\mathbf{x}^T\mathop{Cov}(\mathbf{X})\,\mathbf{x}=\sum_{i,j}x_iE\left[\left(X_i-E\left[X_i\right]\right)\left(X_j-E\left[X_j\right]\right)\right]x_j$$ $$=\sum_{i,j}E\left[\left(x_iX_i-E\left[x_iX_i\right]\right)\left(x_jX_j-E\left[x_jX_j\right]\right)\right]=\mathop{Var}(\mathbf{x}\cdot\mathbf{X})$$

Since variances are non-negative, the covariance matrix is positive-semidefinite. Combined with the multivariate normal construction above, this gives the characterization you were looking for: a matrix is a legal covariance matrix if and only if it is symmetric and positive-semidefinite.
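The identity $\mathbf{x}^T\mathop{Cov}(\mathbf{X})\,\mathbf{x}=\mathop{Var}(\mathbf{x}\cdot\mathbf{X})$ can be checked numerically too. A sketch with assumed random data (note `ddof=1` in `np.var` to match `np.cov`'s default normalization):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 100_000))  # three variables, many observations
Sigma = np.cov(X)

x = np.array([2.0, -1.0, 0.5])  # an arbitrary non-random vector
quad = x @ Sigma @ x            # the quadratic form x^T Sigma x
var = np.var(x @ X, ddof=1)     # the sample variance of x . X

# The quadratic form is exactly the variance of the linear combination...
print(np.allclose(quad, var))
# ...and therefore all eigenvalues are non-negative (up to rounding).
print(np.all(np.linalg.eigvalsh(Sigma) >= -1e-12))
```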