Covariance and Independence problem


The random variables $X$ and $Y$ can each take on only two values. Show that if $Cov(X,Y)=0$, then $X$ and $Y$ are independent.

One can see that the distributions take the form:

$P(X=x_1)=p$ and $P(X=x_2)=1-p$

$P(Y=y_1)=q$ and $P(Y=y_2)=1-q$

To show independence one must show that,

$P(X=x_i,Y=y_j)=P(X=x_i)P(Y=y_j)$ for all $i,j\in\{1,2\}$.

Since $Cov(X,Y)=0$, we have $E[XY]=E[X]E[Y]$. $(1)$

This identity should be usable to demonstrate independence. Do I just need to substitute into $(1)$ and investigate? Note that the above distributions are Bernoulli, i.e. binomial with $n=1$. Any ideas on how to continue?
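As a sanity check on the easy direction (independence implies zero covariance), the quantity to investigate in $(1)$ can be computed directly. The following sketch builds an independent joint law from marginals $p$ and $q$ and confirms that $E[XY]-E[X]E[Y]$ vanishes; all numeric values here are arbitrary choices, not part of the problem.

```python
# Sanity check: under an independent joint distribution built from
# marginals P(X=x1)=p and P(Y=y1)=q, Cov(X,Y) = E[XY] - E[X]E[Y] = 0.
# The values of p, q, x1, x2, y1, y2 below are arbitrary.

p, q = 0.3, 0.7
x1, x2 = 4.0, 1.0
y1, y2 = -2.0, 6.0

# joint probabilities under independence: P(X=x, Y=y) = P(X=x) P(Y=y)
joint = {
    (x1, y1): p * q,
    (x1, y2): p * (1 - q),
    (x2, y1): (1 - p) * q,
    (x2, y2): (1 - p) * (1 - q),
}

E_X = sum(x * pr for (x, _), pr in joint.items())
E_Y = sum(y * pr for (_, y), pr in joint.items())
E_XY = sum(x * y * pr for (x, y), pr in joint.items())

assert abs(E_XY - E_X * E_Y) < 1e-12  # Cov(X,Y) = 0
```

The harder direction, zero covariance implying independence, is what the answer below establishes.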

1 Answer
Let $X\in \{a,b\}$ and $Y\in \{A,B\}$, with $a\neq b$ and $A\neq B$. Let the distribution of the pair $X,Y$ be given by the matrix $\begin{bmatrix}p_{a,A}&p_{a,B}\\ p_{b,A}&p_{b,B}\end{bmatrix},$ where $P(X=i,Y=j)=p_{i,j}.$ Then $1=p_{a,A}+p_{a,B}+p_{b,A}+p_{b,B}$, $P(X=i)=p_{i,A}+p_{i,B},$ and $P(Y=j)=p_{a,j}+p_{b,j}.$ The OP correctly observed that if $Cov(X,Y)=0$ then $E[XY]=E[X]E[Y].$

Now,

$$\begin{matrix} E[Y]&=&A(p_{a,A}+p_{b,A})&+&B(p_{a,B}+p_{b,B})\\ E[X]&=&a(p_{a,A}+p_{a,B})&+&b(p_{b,A}+p_{b,B})\end{matrix}.$$

Then $$E[X]E[Y]=$$

$$\begin{matrix}Aa(p_{a,A}+p_{b,A})(p_{a,A}+p_{a,B})&+&Ab(p_{a,A}+p_{b,A})(p_{b,A}+p_{b,B})&+\\Ba(p_{a,B}+p_{b,B})(p_{a,A}+p_{a,B})&+&Bb(p_{a,B}+p_{b,B})(p_{b,A}+p_{b,B})\end{matrix}.$$

Also,

$$E[XY]=Aap_{a,A}+Abp_{b,A}+Bap_{a,B}+Bbp_{b,B}.$$

Now compare coefficients in $E[XY]-E[X]E[Y]$. Using $p_{a,A}+p_{a,B}+p_{b,A}+p_{b,B}=1$, the coefficient of $Aa$ is $p_{a,A}-(p_{a,A}+p_{b,A})(p_{a,A}+p_{a,B})=p_{a,A}p_{b,B}-p_{a,B}p_{b,A}=:D$; likewise the coefficients of $Ab$ and $Ba$ both equal $-D$, and the coefficient of $Bb$ equals $D$. Hence

$$Cov(X,Y)=E[XY]-E[X]E[Y]=(a-b)(A-B)D.$$

Since $a\neq b$ and $A\neq B$, $Cov(X,Y)=0$ forces $D=0$, i.e. $p_{a,A}p_{b,B}=p_{a,B}p_{b,A}$. Then, for example,

$$P(X=a)P(Y=A)=(p_{a,A}+p_{a,B})(p_{a,A}+p_{b,A})=p_{a,A}\underbrace{(p_{a,A}+p_{a,B}+p_{b,A}+p_{b,B})}_{=1}-D=p_{a,A},$$

and the other three cells factor in the same way, so $X$ and $Y$ are independent.
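The coefficient comparison above collapses to a single determinant condition, $Cov(X,Y)=(a-b)(A-B)\,(p_{a,A}p_{b,B}-p_{a,B}p_{b,A})$. A minimal numerical sketch of that identity, using an arbitrary (generally dependent) random joint distribution, is below; all variable names and values are arbitrary choices for illustration.

```python
import random

# Check the identity
#   Cov(X,Y) = (a-b)(A-B) * (p_aA*p_bB - p_aB*p_bA)
# for a randomly chosen joint distribution on {a,b} x {A,B}.

random.seed(0)

a, b = 2.0, -1.0   # values of X (arbitrary, distinct)
A, B = 5.0, 3.0    # values of Y (arbitrary, distinct)

# random joint probabilities, normalized to sum to 1
w = [random.random() for _ in range(4)]
s = sum(w)
p_aA, p_aB, p_bA, p_bB = (v / s for v in w)

E_X = a * (p_aA + p_aB) + b * (p_bA + p_bB)
E_Y = A * (p_aA + p_bA) + B * (p_aB + p_bB)
E_XY = a*A*p_aA + a*B*p_aB + b*A*p_bA + b*B*p_bB

cov = E_XY - E_X * E_Y
det = p_aA * p_bB - p_aB * p_bA  # the "D" from the coefficient comparison

assert abs(cov - (a - b) * (A - B) * det) < 1e-12
```

Since $a\neq b$ and $A\neq B$, the covariance vanishes exactly when the determinant does, which is exactly the independence condition for a $2\times 2$ joint table.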