Gaussian mixture: zero correlation implies independence?


Consider two random vectors $X\equiv(X_1, X_2)$ and $Y\equiv(Y_1, Y_2)$ distributed as follows:

1) $X\sim N(\begin{pmatrix} \mu_{X,1}\\ \mu_{X,2}\\ \end{pmatrix}, \begin{pmatrix} v_{X,1} & 0\\ 0 & v_{X,2} \end{pmatrix})$

2) $Y\sim N(\begin{pmatrix} \mu_{Y,1}\\ \mu_{Y,2}\\ \end{pmatrix}, \begin{pmatrix} v_{Y,1} & 0\\ 0 & v_{Y,2} \end{pmatrix})$

Consider now the random vector $W\equiv(W_1, W_2)$ whose probability distribution is obtained by mixing $X,Y$ with equal weights $1/2$, i.e. $$ f_W=\frac{1}{2}f_X+ \frac{1}{2}f_Y $$ where $f$ denotes the pdf.
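A mixture like this is easy to simulate: draw a fair coin, then sample from whichever component it selects. A minimal sketch (assuming numpy; the parameter values are purely illustrative, chosen so that $\mu_{X,1}=\mu_{Y,1}$):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mixture(n, mu_x, v_x, mu_y, v_y):
    """Draw n samples of W: pick the X or Y component with probability 1/2,
    then sample from the chosen diagonal-covariance bivariate normal."""
    pick_x = rng.random(n) < 0.5
    mu = np.where(pick_x[:, None], mu_x, mu_y)         # per-sample means, shape (n, 2)
    sd = np.sqrt(np.where(pick_x[:, None], v_x, v_y))  # per-sample std devs
    return mu + sd * rng.standard_normal((n, 2))

# Hypothetical parameters with mu_{X,1} = mu_{Y,1} = 0
w = sample_mixture(200_000, mu_x=[0.0, 1.0], v_x=[10.0, 1.0],
                   mu_y=[0.0, -1.0], v_y=[1.0, 3.0])
print(np.corrcoef(w[:, 0], w[:, 1])[0, 1])  # close to 0
```

With equal first-coordinate means the sample correlation comes out near zero, even though (as discussed below) $W_1$ and $W_2$ need not be independent.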

Suppose also that $\mu_{X,1}=\mu_{Y,1}$, which implies that $\mathrm{corr}(W_1, W_2)=0$: indeed, since the coordinates are independent within each component, $$E[W_1W_2]=\tfrac{1}{2}(\mu_{X,1}\mu_{X,2}+\mu_{Y,1}\mu_{Y,2})=\mu_{X,1}\,\tfrac{\mu_{X,2}+\mu_{Y,2}}{2}=E[W_1]\,E[W_2].$$

Question: does $\mathrm{corr}(W_1, W_2)=0$ imply that $W_1$ is independent of $W_2$?


EDIT following suggestions below: Let $M_{W_1, W_2}$ denote the moment generating function of $(W_1, W_2)$. By construction (using the normal MGF $\exp(\mu s+\tfrac{1}{2}\sigma^2 s^2)$) $$ M_{W_1, W_2}(s,t)=\frac{1}{2}\exp\!\left(s\mu_{X,1}+t\mu_{X,2}+\tfrac{1}{2}s^2v_{X,1}+\tfrac{1}{2}t^2v_{X,2}\right)+\frac{1}{2}\exp\!\left(s\mu_{Y,1}+t\mu_{Y,2}+\tfrac{1}{2}s^2v_{Y,1}+\tfrac{1}{2}t^2v_{Y,2}\right) $$

Proposition: $M_{W_1, W_2}(s,t)=M_{W_1, W_2}(s,0)\times M_{W_1, W_2}(0,t)$ for all $s,t$ in a neighbourhood of the origin if and only if $W_1$ is independent of $W_2$.

In my case $$ M_{W_1, W_2}(s,0)\times M_{W_1, W_2}(0,t)=\frac{1}{4}\exp\!\left(s\mu_{X,1}+t\mu_{X,2}+\tfrac{1}{2}s^2v_{X,1}+\tfrac{1}{2}t^2v_{X,2}\right) $$ $$ +\frac{1}{4}\exp\!\left(s\mu_{Y,1}+t\mu_{Y,2}+\tfrac{1}{2}s^2v_{Y,1}+\tfrac{1}{2}t^2v_{Y,2}\right) $$ $$ +\frac{1}{4}\exp\!\left(s\mu_{X,1}+t\mu_{Y,2}+\tfrac{1}{2}s^2v_{X,1}+\tfrac{1}{2}t^2v_{Y,2}\right) $$ $$ +\frac{1}{4}\exp\!\left(s\mu_{Y,1}+t\mu_{X,2}+\tfrac{1}{2}s^2v_{Y,1}+\tfrac{1}{2}t^2v_{X,2}\right) $$ Notice that $$ M_{W_1, W_2}(s,0)\times M_{W_1, W_2}(0,t)=M_{W_1, W_2}(s,t) $$ if and only if

$\mu_{X,1}=\mu_{Y,1}$ and $v_{X,1}=v_{Y,1}$

or

$\mu_{X,2}=\mu_{Y,2}$ and $v_{X,2}=v_{Y,2}$

Therefore, by the proposition above, $W_1 \perp W_2$ if and only if

$\mu_{X,1}=\mu_{Y,1}$ and $v_{X,1}=v_{Y,1}$

or

$\mu_{X,2}=\mu_{Y,2}$ and $v_{X,2}=v_{Y,2}$.
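This factorization condition can be sanity-checked numerically. The sketch below (standard library only; parameter values are hypothetical) evaluates the mixture MGF, written with the usual $\exp(\mu s+\tfrac{1}{2}\sigma^2 s^2)$ normal MGF, at a few points: once with the first coordinate fully matched across components, and once with equal means but different variances.

```python
import math

def mgf(s, t, mx, vx, my, vy):
    """Joint MGF of the equal-weight mixture of two bivariate normals with
    independent coordinates; mx = (mu_{X,1}, mu_{X,2}), vx = (v_{X,1}, v_{X,2})."""
    ex = math.exp(s * mx[0] + t * mx[1] + 0.5 * (s * s * vx[0] + t * t * vx[1]))
    ey = math.exp(s * my[0] + t * my[1] + 0.5 * (s * s * vy[0] + t * t * vy[1]))
    return 0.5 * (ex + ey)

# First coordinate fully matched (equal mean AND variance) -> MGF factorizes.
args_ok = dict(mx=(0.0, 1.0), vx=(2.0, 1.0), my=(0.0, -1.0), vy=(2.0, 3.0))
# Equal first-coordinate means but different variances -> factorization fails.
args_bad = dict(mx=(0.0, 1.0), vx=(10.0, 1.0), my=(0.0, -1.0), vy=(1.0, 3.0))

for s, t in [(0.3, -0.7), (1.0, 0.5)]:
    ok = mgf(s, t, **args_ok) - mgf(s, 0, **args_ok) * mgf(0, t, **args_ok)
    bad = mgf(s, t, **args_bad) - mgf(s, 0, **args_bad) * mgf(0, t, **args_bad)
    print(f"matched: {ok:.2e}   mismatched: {bad:.2e}")
```

In the matched case the difference is zero up to floating-point error; in the mismatched case it is far from zero, consistent with the claimed condition.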

Could you tell me whether these derivations are correct and, if wrong, where are the mistakes?

Accepted answer:

No, not in general. If $W_1$ and $W_2$ were independent, the value of $E(W_1^2 W_2)$ would be $E(W_1^2)\times E(W_2)$; and we can compute all three expectations in terms of the problem parameters. Let's make our life a bit easier by taking $\mu_{X,1}=\mu_{Y,1}=0.$

First, $E W_1^2 W_2 = (E X_1^2 X_2 + EY_1^2 Y_2)/2 = (v_{X,1} \mu_{X,2} + v_{Y,1}\mu_{Y,2})/2$. Second, $EW_1^2 = (v_{X,1}+v_{Y,1})/2.$ Finally, $EW_2 = (\mu_{X,2}+\mu_{Y,2})/2$.

Independence would thus imply $$ \frac{v_{X,1} \mu_{X,2} + v_{Y,1}\mu_{Y,2}}2 = \frac{v_{X,1}+v_{Y,1}} 2 \times \frac{\mu_{X,2}+\mu_{Y,2}}2.$$ If we pick $v_{X,1} = 10$ and $v_{Y,1}=1$, and pick $\mu_{X,2} = -\mu_{Y,2} = 1$, say, independence would imply $$\frac{10-1} 2 = \frac{ 10+1} 2 \times \frac { 1 -1 } 2 = 0,$$ a contradiction.
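The arithmetic in this counterexample can be spelled out directly (plain Python; the numbers are the ones chosen above):

```python
# Counterexample check: with mu_{X,1} = mu_{Y,1} = 0, compare the exact
# mixture moments E[W1^2 W2] and E[W1^2] * E[W2].
v_x1, v_y1 = 10.0, 1.0      # first-coordinate variances of the two components
mu_x2, mu_y2 = 1.0, -1.0    # second-coordinate means of the two components

e_w1sq_w2 = (v_x1 * mu_x2 + v_y1 * mu_y2) / 2  # (10 - 1)/2 = 4.5
e_w1sq = (v_x1 + v_y1) / 2                     # (10 + 1)/2 = 5.5
e_w2 = (mu_x2 + mu_y2) / 2                     # (1 - 1)/2 = 0.0

print(e_w1sq_w2, e_w1sq * e_w2)  # 4.5 vs 0.0 -> W1 and W2 are dependent
```

Since $4.5 \neq 0$, the product rule for moments fails and independence is ruled out.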

Moral: mixing and independence don't play together nicely. Mixtures of product measures are typically not product measures; the current case is no exception.