Multivariate Gaussian: How to show $\Sigma$ is covariance matrix?


Let $\mathbf{X}$ be a multivariate Gaussian random variable in $\mathbb{R}^d$ with pdf $$ f(\mathbf{x}) =\frac{1}{Z} e ^ {-\frac{1}{2}(\mathbf{x}-\mathbf{b})^T \Sigma^{-1}(\mathbf{x} - \mathbf{b})} $$ for some vector $\mathbf{b} \in \mathbb{R}^d$ and some positive definite, symmetric, $d \times d$ matrix $\Sigma$.

Show that

a) $Z = \sqrt{(2\pi)^d \det({\Sigma})}$ is the normalization constant

b) $\mathbf{b}$ is the expected value of $\mathbf{X}$

c) $\Sigma$ is the covariance of $\mathbf{X}$

I've completed (a) and (b) but I'm stuck on (c). I know that $\operatorname{Cov}(X_i,X_j) = E(X_iX_j) - b_ib_j$, but how do I show that $\Sigma_{i,j} = \operatorname{Cov}(X_i,X_j)$? Help or hints greatly appreciated!

Let $C$ be a symmetric matrix such that $\Sigma = CC^T$ (such a $C$ exists because $\Sigma$ is symmetric positive definite -- why?). Let $Z = C^{-1}(X-b)$. Show that the components of $Z$ are i.i.d. normal with mean $0$ and variance $1$.
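To see why the components of $Z$ are standard normal, apply the change of variables $x = Cz + b$; since $C$ is symmetric with $\Sigma = CC^T$, we have $\Sigma^{-1} = C^{-T}C^{-1}$ and $\lvert\det C\rvert = \sqrt{\det\Sigma}$, so the quadratic form collapses:

```latex
% (x-b)^T \Sigma^{-1} (x-b) = z^T C^T C^{-T} C^{-1} C z = z^T z,
% and the change of variables contributes a factor |\det C| = \sqrt{\det\Sigma}:
f_Z(z) = f_X(Cz + b)\,\lvert\det C\rvert
       = \frac{\sqrt{\det\Sigma}}{\sqrt{(2\pi)^d \det\Sigma}}\, e^{-\frac{1}{2} z^T z}
       = \prod_{i=1}^{d} \frac{1}{\sqrt{2\pi}}\, e^{-z_i^2/2}.
```

The density factors into a product of $d$ standard normal densities, so the components of $Z$ are i.i.d. $N(0,1)$.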

Next use the fact that $X = CZ + b$ and that the variance-covariance matrix of $X$ is $E\big[ (X-b) (X-b)^T \big]$.
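Putting the two facts together, the computation finishes in one line, since $E[ZZ^T] = I$ when the components of $Z$ are i.i.d. with mean $0$ and variance $1$:

```latex
\operatorname{Cov}(X) = E\big[(X-b)(X-b)^T\big]
                      = E\big[CZZ^TC^T\big]
                      = C\,E[ZZ^T]\,C^T
                      = C I C^T
                      = \Sigma.
```

Reading off the $(i,j)$ entry gives $\Sigma_{ij} = \operatorname{Cov}(X_i, X_j)$, which answers (c).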
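As a quick numerical sanity check (not part of the proof), the construction in this answer can be run with NumPy: build a symmetric square root $C$ of $\Sigma$ via an eigendecomposition, map i.i.d. standard normals through $X = CZ + b$, and confirm the empirical mean and covariance match $b$ and $\Sigma$. The particular $\Sigma$, $b$, and sample size here are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])   # a positive definite, symmetric example
b = np.array([1.0, -2.0])

# Symmetric square root C of Sigma: Sigma = C C^T
w, V = np.linalg.eigh(Sigma)
C = V @ np.diag(np.sqrt(w)) @ V.T

# Draw i.i.d. standard normal components Z and map X = C Z + b
Z = rng.standard_normal((200_000, 2))
X = Z @ C.T + b

print(X.mean(axis=0))   # close to b
print(np.cov(X.T))      # close to Sigma
```

The empirical mean and covariance converge to $b$ and $\Sigma$ as the sample size grows, consistent with parts (b) and (c).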