Independent random variables are uncorrelated. In general, uncorrelated random variables need not be independent. But in specific examples they are.

Question: Let ξ and η be Bernoulli(p) and Bernoulli(r) random variables, 0 < p, r < 1. Show that if ξ and η are uncorrelated then they are independent.

My attempt: I'm not exactly sure where to begin with this. I know that if the covariance between two random variables is equal to zero, then they are said to be uncorrelated. However, I struggled to complete this.


To show that two discrete random variables are independent, it suffices to show that $P(\xi=a\cap \eta=b)=P(\xi=a)P(\eta=b)$ for all values $a,b$ that $\xi,\eta$ can take. Here each of $a,b$ can only be $0$ or $1$. I'll take care of the case $a=b=1$.

The correlation between $\xi$ and $\eta$ is $\frac{E[\xi \eta]-E[\xi]E[\eta]}{\sqrt{\text{Var }\xi\,\text{Var }\eta}}$. Since $0 < p, r < 1$, both variances are positive, so the correlation is zero exactly when the numerator is. This gives $$ E[\xi\eta]=E[\xi]E[\eta]=p\cdot r=P(\xi=1)P(\eta=1). $$

To find $E[\xi\eta]$, note that $\xi\eta$ is always equal to $0$ or $1$, so $$ E[\xi\eta] = P(\xi\eta=1)=P(\xi=1\cap \eta=1) $$

The above proves that $$P(\xi=1\cap \eta=1)=P(\xi=1)P(\eta=1).$$ The remaining three cases follow by complementation: for instance, $$P(\xi=1\cap \eta=0)=P(\xi=1)-P(\xi=1\cap \eta=1)=p-pr=p(1-r)=P(\xi=1)P(\eta=0),$$ and the other two cases are handled the same way.
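As a numerical sanity check (not part of the proof), here is a short Python sketch: given marginals $p$ and $r$, zero covariance forces $P(\xi=1,\eta=1)=pr$, the other three cells of the joint pmf are then determined by the marginals, and every cell factors. The names `joint_pmf` and `is_independent` are my own labels for this illustration.

```python
import itertools

def joint_pmf(p, r):
    # Zero covariance forces P(xi=1, eta=1) = p*r; the other three
    # cells are then determined by the marginals p and r.
    p11 = p * r
    p10 = p - p11              # P(xi=1, eta=0)
    p01 = r - p11              # P(xi=0, eta=1)
    p00 = 1 - p11 - p10 - p01  # P(xi=0, eta=0)
    return {(1, 1): p11, (1, 0): p10, (0, 1): p01, (0, 0): p00}

def is_independent(p, r, tol=1e-12):
    # Check that every cell of the joint pmf factors into the
    # product of the two marginal probabilities.
    pmf = joint_pmf(p, r)
    marg_xi = {1: p, 0: 1 - p}
    marg_eta = {1: r, 0: 1 - r}
    return all(abs(pmf[a, b] - marg_xi[a] * marg_eta[b]) < tol
               for a, b in itertools.product((0, 1), repeat=2))

print(all(is_independent(p, r)
          for p in (0.1, 0.37, 0.5, 0.9)
          for r in (0.2, 0.55, 0.88)))  # → True
```

This mirrors the proof exactly: the $(1,1)$ cell is pinned down by the zero-covariance condition, and the rest is complementation.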