Proof that the covariance matrix of a multivariate normal distribution is positive definite


I want to know the proof that the covariance matrix of a multivariate normal distribution is positive definite, which is needed in order for the distribution to have a pdf.

For $f$ to be a pdf, where $x$ is a random vector of size $n$:

1) $f(x) \geq 0$

2) $$\int_{\mathbb{R}^n} f(x)\,dx = 1$$

How can this be proven using these two properties of a pdf?
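For context, the standard multivariate normal density itself already presupposes that the covariance matrix $\Sigma$ is invertible:

$$f(x) = \frac{1}{(2\pi)^{n/2}\det(\Sigma)^{1/2}} \exp\!\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1}(x-\mu)\right),$$

which only makes sense when $\Sigma^{-1}$ exists and $\det(\Sigma) > 0$, i.e. when $\Sigma$ (being symmetric positive semi-definite) is in fact positive definite.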

There are 2 answers below.

Answer 1:

You don't need the density function to prove this. For all real coefficients $(a_i)$, $$\sum_{i,j} a_i a_j \operatorname{cov}(X_i,X_j)=\sum_{i,j} a_i a_j\, E\big[(X_i-EX_i)(X_j-EX_j)\big]=E\Big[\Big(\sum_i a_i(X_i-EX_i)\Big)^{2}\Big] \geq 0.$$
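As a quick numerical sanity check of this identity (a sketch in NumPy with a made-up 3-dimensional example, not part of the original answer): the quadratic form $a^T C a$ built from the sample covariance equals the sample variance of $\sum_i a_i (X_i - \bar X_i)$, which is an average of squares and hence nonnegative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 3-dimensional sample with some dependence between coordinates.
X = rng.normal(size=(10000, 3))
X[:, 2] = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=10000)

# Sample covariance matrix (rows of X are observations, columns variables).
C = np.cov(X, rowvar=False)

# For any coefficients a, a^T C a equals the sample variance of
# sum_i a_i (X_i - mean(X_i)) -- an average of squares, hence >= 0.
a = rng.normal(size=3)
quad_form = a @ C @ a
centered = X - X.mean(axis=0)
variance_of_combo = np.var(centered @ a, ddof=1)

print(quad_form >= 0)                            # True
print(np.isclose(quad_form, variance_of_combo))  # True
```

(`np.cov` uses the unbiased $1/(n-1)$ normalization by default, which is why `ddof=1` is passed to `np.var` for an exact match.)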

Answer 2:

Any covariance matrix is symmetric and positive semi-definite.

Let $X=(X_1,\dots,X_n)^T$ be a multivariate random variable. For simplicity, assume it is centered (that is, $E(X_i)=0$). The covariance matrix is defined by its coefficients: $$C_{ij}=E(X_iX_j)$$ In other words, the covariance matrix is given by $C=E(XX^T)$. Therefore, for any vector $u\in\mathbb R^n$, $$u^TCu=u^TE(XX^T)u=E(u^TXX^Tu)=E(\langle u, X\rangle^2)\geq 0$$ Equality to $0$ is achieved iff there exists a nonzero $u\in \mathbb R^n$ such that $\langle u, X\rangle=0$ almost surely. That is, iff the random variable $X$ doesn't span the full space $\mathbb R^n$ but is confined to a strict subspace.
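The degenerate case is easy to exhibit numerically (a sketch with a hypothetical made-up example, not from the original answer): if one coordinate is a deterministic linear combination of the others, the covariance matrix has a zero eigenvalue and is only semi-definite.

```python
import numpy as np

rng = np.random.default_rng(1)

# Degenerate example: X3 = X1 + X2 almost surely, so X is confined to
# the strict subspace {x in R^3 : x1 + x2 - x3 = 0}.
X12 = rng.normal(size=(10000, 2))
X = np.column_stack([X12[:, 0], X12[:, 1], X12[:, 0] + X12[:, 1]])

C = np.cov(X, rowvar=False)

# u = (1, 1, -1) satisfies <u, X> = 0, so u^T C u = 0 up to rounding:
u = np.array([1.0, 1.0, -1.0])
print(abs(u @ C @ u) < 1e-10)   # True: C is singular, only semi-definite

# Equivalently, C has a (numerically) zero smallest eigenvalue.
print(abs(np.linalg.eigvalsh(C).min()) < 1e-8)  # True
```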

This can't happen for a nondegenerate normal distribution (one that actually has a density on $\mathbb R^n$), therefore in that case the matrix is positive definite.