Proof that zero covariance implies independence for two Bernoulli-distributed variables


I want to prove that if $X,Y$ are Bernoulli-distributed (with parameters $p_1, p_2$), and if $Cov(X,Y) = 0$, then $X$ and $Y$ are independent. My proof is the following:

$Cov(X,Y)=0 \iff E(XY) = E(X)E(Y)$

Then, since $XY$ is also Bernoulli-distributed (it only takes values in $\{0,1\}$):

$P(XY = 1) = E(XY) = E(X)E(Y)$

$P(X=1, Y=1) = P(X =1)P(Y=1)$

But now I am unsure whether I can conclude independence from this alone, or whether I also need to show the following equalities: $P(X=0, Y=1) = P(X=0)P(Y=1)$

$P(X=1, Y=0) = P(X =1)P(Y=0)$

$P(X=0, Y=0) = P(X =0)P(Y=0)$

Thank you !

PS: sorry for the spacing, but in the preview $$ \\ $$ doesn't seem to insert a newline.


Best answer:

The others follow immediately. Define $Z = 1 - Y \sim \mathrm{Bern}(P(Y=0))$. Note $Z$ is linear in $Y$, so it is also uncorrelated with $X$. Then by your result, $$P(X=1,Y=0)=P(X=1,Z=1)=P(X=1)P(Z=1)=P(X=1)P(Y=0).$$

Others are similar.
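As a numerical sanity check of the whole argument (a sketch, not part of the original answer): for Bernoulli marginals $p_1, p_2$, the joint distribution is fully determined by $q = P(X=1, Y=1)$, and $Cov(X,Y) = q - p_1 p_2$. Zero covariance forces $q = p_1 p_2$, after which all four joint cells factor. The helper names below (`joint_cells`, `is_independent`) are illustrative, not from the thread.

```python
# Sanity check: for Bernoulli X, Y with parameters p1, p2, the joint law
# is determined by q = P(X=1, Y=1), and Cov(X, Y) = q - p1*p2.
# Zero covariance forces q = p1*p2, and then every joint cell factors.

def joint_cells(p1, p2, q):
    """Joint pmf of (X, Y) given marginals p1, p2 and q = P(X=1, Y=1)."""
    return {
        (1, 1): q,
        (1, 0): p1 - q,           # P(X=1) = P(X=1,Y=1) + P(X=1,Y=0)
        (0, 1): p2 - q,
        (0, 0): 1 - p1 - p2 + q,  # the four cells sum to 1
    }

def is_independent(p1, p2, q, tol=1e-12):
    """True iff every joint cell equals the product of its marginals."""
    marg_x = {1: p1, 0: 1 - p1}
    marg_y = {1: p2, 0: 1 - p2}
    cells = joint_cells(p1, p2, q)
    return all(abs(cells[(x, y)] - marg_x[x] * marg_y[y]) < tol
               for (x, y) in cells)

p1, p2 = 0.3, 0.7
print(is_independent(p1, p2, p1 * p2))         # zero covariance -> True
print(is_independent(p1, p2, p1 * p2 + 0.05))  # nonzero covariance -> False
```

This mirrors the proof: checking the single cell $(1,1)$ pins down $q$, and the other three cells then factor automatically via the marginal constraints.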