Why, for jointly normal random variables, is zero covariance between the components equivalent to independence?


For generic random variables $X$ and $Y$, we know that $$X\perp \!\!\! \perp Y \implies \text{Cov}(X,Y)=0\tag{1}$$ but not vice versa.
Now consider two random variables $X$, $Y$ whose joint distribution $(X, Y)$ is normal. Why does it hold in this case that:

$$ X \perp \!\!\! \perp Y \iff \text{Cov}(X,Y)=0\tag{2}$$?


Could you please make your reasoning explicit, specifying why the joint normality of $X$ and $Y$ is essential for $(2)$ to hold?
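As a quick numerical sanity check of the claim (my own illustration, not part of the question): sample a jointly normal pair with zero covariance and verify that event probabilities factor, e.g. $P(X>0,\,Y>0)=P(X>0)\,P(Y>0)=0.25$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Jointly normal (X, Y) with zero covariance: identity covariance matrix.
xy = rng.multivariate_normal(mean=[0.0, 0.0],
                             cov=[[1.0, 0.0], [0.0, 1.0]],
                             size=n)
x, y = xy[:, 0], xy[:, 1]

# If X and Y are independent, the joint probability equals the product.
joint = np.mean((x > 0) & (y > 0))
prod = np.mean(x > 0) * np.mean(y > 0)
print(joint, prod)  # both ≈ 0.25
```

Of course this checks only one pair of events; the accepted answer below gives the actual density argument.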

Best Answer

Let $X$ and $Y$ be jointly Gaussian, with means $\mu_X,\mu_Y$, variances $\sigma_X^2,\sigma_Y^2$, and correlation coefficient $\rho=\text{Cov}(X,Y)/(\sigma_X\sigma_Y)$. Their joint density is

$$f_{XY}(x,y)=\frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}\exp\Bigg\{-\frac{1}{2(1-\rho^2)}\Bigg[\frac{(x-\mu_X)^2}{\sigma_X^2}-2\rho\frac{(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y}+\frac{(y-\mu_Y)^2}{\sigma_Y^2}\Bigg]\Bigg\}\tag{1}$$

Now, recall that $X$ and $Y$ are independent if and only if $f_{XY}(x,y)=f_X(x)f_Y(y)$. Since $\text{Cov}(X,Y)=0$ is equivalent to $\rho=0$, set $\rho=0$ in $(1)$: the cross term vanishes, the exponent splits into a sum, and the density factors as

$$f_{XY}(x,y)=\frac{1}{\sqrt{2\pi}\,\sigma_X}e^{-\frac{(x-\mu_X)^2}{2\sigma_X^2}}\cdot\frac{1}{\sqrt{2\pi}\,\sigma_Y}e^{-\frac{(y-\mu_Y)^2}{2\sigma_Y^2}}=f_X(x)f_Y(y),$$

hence $X\perp \!\!\! \perp Y$. Together with the general implication (independence $\implies$ zero covariance), this proves $(2)$. Joint normality is essential here: it is this particular functional form that makes $\rho=0$ force the density to factor.
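To see that joint normality cannot be dropped, here is a numerical sketch of the standard counterexample (the code is my illustration, not part of the answer): take $X$ standard normal and $Y=SX$, where $S$ is an independent random sign. Both marginals are standard normal and $\text{Cov}(X,Y)=E[S]\,E[X^2]=0$, yet $X$ and $Y$ are clearly dependent, because $(X,Y)$ is not jointly normal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X standard normal; S an independent random sign.
x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
y = s * x  # Y is also standard normal by symmetry, but (X, Y) is NOT jointly normal.

# Cov(X, Y) = E[S] E[X^2] = 0 ...
cov = np.cov(x, y)[0, 1]
print(f"sample covariance ≈ {cov:.3f}")  # close to 0

# ... yet X and Y are dependent: |Y| = |X| exactly.
print(np.allclose(np.abs(x), np.abs(y)))  # True
```

So zero covariance plus normal marginals is not enough; the factorization argument above needs the joint density $(1)$.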