X and Y are Bernoulli. Suppose $P(X = 1, Y = 1) = P(X = 1) P(Y = 1)$. Prove that X and Y must be independent


Let $X \sim \text{Bernoulli}(\theta)$ and $Y \sim \text{Bernoulli}(\psi)$, where $0 < \theta < 1$ and $0 < \psi < 1$. Suppose $$P(X = 1, Y = 1) = P(X = 1) P(Y = 1).$$ Prove that X and Y must be independent.

Does it mean we have to prove $$P(X = 1, Y = 1) = P(X = 1) P(Y = 1)?$$


3 Answers


In addition to the given case, you will need to show:
$P(X=0,Y=0)=P(X=0)P(Y=0)$
$P(X=0,Y=1)=P(X=0)P(Y=1)$
$P(X=1,Y=0)=P(X=1)P(Y=0)$


HINT:

The definition of independence is that $P(X=x,Y=y)=P(X=x)P(Y=y)$ for all $x\in S_X$ and $y\in S_Y$. You are given that this is true for $x=y=1$, and so you have three more cases in which to prove it: $x=0,y=0$; $x=1,y=0$; and $x=0,y=1$.


By definition: $P(X=1)=\theta$ and $P(Y=1)=\psi$.

We are given: $P(X=1,Y=1) = P(X=1)P(Y=1) = \theta \psi$.

For $X$ and $Y$ to be independent, we have to show:

  • $P(X=1,Y=1) = P(X=1)P(Y=1)$, which is given
  • $P(X=1,Y=0) = P(X=1)P(Y=0) = \theta(1-\psi)$
  • $P(X=0,Y=1) = P(X=0)P(Y=1) = (1-\theta)\psi$
  • $P(X=0,Y=0) = P(X=0)P(Y=0) = (1-\theta)(1-\psi)$

Let's use the definition of conditional probability:

$P(X=1\mid Y=1) = \frac{P(X=1,Y=1)}{P(Y=1)} = P(X=1) = \theta$, by the given hypothesis. It follows that $P(X=0\mid Y=1)=1-\theta$. By the same token, $P(Y=1\mid X=1) = \psi$ and $P(Y=0\mid X=1) = 1-\psi$.

Now we apply the definition of conditional probability a few more times, together with the results derived above:

$P(X=0, Y=1) = P(X=0|Y=1)P(Y=1) = (1-\theta)\psi = P(X=0)P(Y=1)$

$P(X=1, Y=0) = P(Y=0|X=1)P(X=1) = (1-\psi)\theta = P(X=1)P(Y=0)$

$P(X=0, Y=0) = 1-P(X=1,Y=1)-P(X=0,Y=1)-P(X=1,Y=0)=1-\theta\psi-(1-\theta)\psi-(1-\psi)\theta = 1-\theta-\psi+\theta\psi = (1-\theta)(1-\psi) = P(X=0)P(Y=0)$.

Thus, we have proved all the necessary conditions for concluding that $X$ and $Y$ are independent.
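As a sanity check (not a substitute for the proof), here is a short Python sketch that follows the derivation above: starting from $P(X=1,Y=1)=\theta\psi$, it computes the other three joint probabilities exactly as in the answer and confirms each cell factorizes into the product of its marginals. The specific $(\theta, \psi)$ values are arbitrary choices in $(0,1)$ for illustration.

```python
# Sanity check of the derivation (illustration only; the (theta, psi)
# pairs below are arbitrary values in (0, 1), not from the problem).
for theta, psi in [(0.3, 0.7), (0.5, 0.5), (0.9, 0.2)]:
    p11 = theta * psi            # given: P(X=1, Y=1) = P(X=1) P(Y=1)
    p01 = (1 - theta) * psi      # P(X=0 | Y=1) * P(Y=1)
    p10 = (1 - psi) * theta      # P(Y=0 | X=1) * P(X=1)
    p00 = 1 - p11 - p01 - p10    # the four joint probabilities sum to 1

    # each joint probability equals the product of its marginals
    assert abs(p01 - (1 - theta) * psi) < 1e-12
    assert abs(p10 - theta * (1 - psi)) < 1e-12
    assert abs(p00 - (1 - theta) * (1 - psi)) < 1e-12

print("all four joint probabilities factorize")
```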