Covariance of a mixture of Gaussians


I have seen this question asked, but in a strange way that I do not think is equivalent. If someone can show that the formulations are identical, I would be grateful.

Suppose that with probability $p$ one draws a point from a Gaussian distribution $N(0,\hat\Sigma_1)$, and with probability $1-p$ from $N(0,\hat\Sigma_2)$, where $\hat\Sigma_1$ and $\hat\Sigma_2$ are both $d \times d$ matrices (writing $d$ for the dimension to avoid clashing with the mixing probability $p$). After $n$ such draws, one can compute the covariance matrix $\Sigma$ of the points.

Can we have an explicit formula for $\Sigma$?

Best answer

The variable can be written as $$ X = 1_{U<p} X_1 + 1_{U\ge p} X_2, \qquad X_1 \sim N(0, \hat\Sigma_1),\ X_2 \sim N(0, \hat\Sigma_2), $$ with $U\sim U[0,1]$ independent of $X_1$ and $X_2$. This has expected value $0$ and second moment \begin{align} E[X_iX_j] &= E[(1_{U<p} X_{i,1} + 1_{U\ge p} X_{i,2})(1_{U<p} X_{j,1} + 1_{U\ge p} X_{j,2})] \\&= E[1_{U<p} X_{i,1} X_{j,1}] + E[1_{U\ge p} X_{i,2} X_{j,2}], \end{align} where the cross terms vanish because $1_{U<p}\,1_{U\ge p} = 0$.

Now use the independence of $U$ from $X_1$ and $X_2$: $E[1_{U<p} X_{i,1} X_{j,1}] = P(U<p)\,E[X_{i,1} X_{j,1}] = p\,(\hat\Sigma_1)_{ij}$, and similarly for the second term with $P(U\ge p) = 1-p$. Since the mean is $0$, the second moment is the covariance, so the covariance matrix is $$ p\hat\Sigma_1 + (1-p)\hat\Sigma_2. $$
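As a sanity check on the formula, here is a quick Monte Carlo sketch using NumPy. The dimension, mixing weight, and component covariances below are illustrative choices, not from the question; it draws from the mixture and compares the empirical covariance against $p\hat\Sigma_1 + (1-p)\hat\Sigma_2$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the question): 2-D case
p = 0.3
Sigma1 = np.array([[2.0, 0.5], [0.5, 1.0]])
Sigma2 = np.array([[1.0, -0.3], [-0.3, 3.0]])

n = 200_000
# With probability p draw from N(0, Sigma1), otherwise from N(0, Sigma2)
from_first = rng.random(n) < p
x1 = rng.multivariate_normal(np.zeros(2), Sigma1, size=n)
x2 = rng.multivariate_normal(np.zeros(2), Sigma2, size=n)
x = np.where(from_first[:, None], x1, x2)

empirical = np.cov(x, rowvar=False)           # sample covariance of the mixture
theoretical = p * Sigma1 + (1 - p) * Sigma2   # formula from the answer

print(np.round(empirical, 2))
print(np.round(theoretical, 2))
assert np.allclose(empirical, theoretical, atol=0.05)
```

With $n$ this large the empirical covariance agrees with the formula to about two decimal places; note that the mixture is not Gaussian even though its covariance is the weighted average of the component covariances.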