Find Correlation and dependency of $X+Y$ and $|X-Y|$ when $X,Y$ are Bernoulli trials


Let $X,Y$ be independent Bernoulli random variables with $p=0.5$. We define:

$Z = X+Y$

$W = |X-Y|$

Are $Z,W$ independent? Are they correlated?

I believe $Z \sim \operatorname{Binomial}(2, 0.5)$ and $W \sim \operatorname{Bernoulli}(0.5)$. Is this correct? And how can I prove or disprove independence and correlation?
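A quick sanity check (not part of the question): enumerating the four equally likely outcomes of $(X,Y)$ confirms the two claimed distributions.

```python
from itertools import product
from fractions import Fraction

p = Fraction(1, 2)  # success probability from the question

# Enumerate the four outcomes of (X, Y) and accumulate the pmfs of Z and W.
pmf_Z, pmf_W = {}, {}
for x, y in product((0, 1), repeat=2):
    weight = p ** (x + y) * (1 - p) ** (2 - x - y)  # P(X=x) P(Y=y), by independence
    z, w = x + y, abs(x - y)
    pmf_Z[z] = pmf_Z.get(z, 0) + weight
    pmf_W[w] = pmf_W.get(w, 0) + weight

print(pmf_Z)  # masses 1/4, 1/2, 1/4 on 0, 1, 2 -> Binomial(2, 1/2)
print(pmf_W)  # masses 1/2, 1/2 on 0, 1        -> Bernoulli(1/2)
```

So yes: $Z \sim \operatorname{Binomial}(2,0.5)$ and $W \sim \operatorname{Bernoulli}(0.5)$.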



Are $Z,W$ independent?

Observe that: $$W=0\iff Z\text{ is even}$$

Since $Z$ has positive probability of being odd and positive probability of being even, the event $\{W=0\}$ genuinely constrains $Z$, so $W$ and $Z$ are not independent.

Do they have correlation?

Yes, but the intended question is probably: "do they have a correlation that differs from $0$?"

To find that out it suffices to check whether $\mathsf{Cov}(Z,W)=0$.

Covariance is bilinear, so: $$\mathsf{Cov}(Z,W)=\mathsf{Cov}(X+Y,W)=\mathsf{Cov}(X,W)+\mathsf{Cov}(Y,W)\tag1$$ Now observe that $W=|X-Y|=|Y-X|$ is symmetric in $X$ and $Y$; since $X$ and $Y$ are i.i.d., swapping them does not change the joint distribution, so $(X,W)$ and $(Y,W)$ have the same joint distribution and consequently: $$\mathsf{Cov}(X,W)=\mathsf{Cov}(Y,W)\tag2$$ Combining $(1)$ and $(2)$ gives $\mathsf{Cov}(Z,W)=2\,\mathsf{Cov}(X,W)$, hence: $$\mathsf{Cov}(Z,W)=0\iff\mathsf{Cov}(X,W)=0\tag3$$

It remains to check whether the condition $\mathsf{Cov}(X,W)=0$ on the right-hand side of $(3)$ holds.

I leave that to you.
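For the reader who wants to verify the remaining check: a short exact enumeration (a sketch, using the question's value $p=\tfrac12$) computes $\mathsf{Cov}(X,W)$ directly.

```python
from itertools import product
from fractions import Fraction

p = Fraction(1, 2)  # value from the question

def prob(x, y):
    """P(X=x, Y=y) under independence."""
    px = p if x == 1 else 1 - p
    py = p if y == 1 else 1 - p
    return px * py

outcomes = list(product((0, 1), repeat=2))

# Cov(X, W) = E[X|X-Y|] - E[X] E[|X-Y|]
e_xw = sum(x * abs(x - y) * prob(x, y) for x, y in outcomes)
e_x = p
e_w = sum(abs(x - y) * prob(x, y) for x, y in outcomes)
cov_xw = e_xw - e_x * e_w
print(cov_xw)  # 0 at p = 1/2, so by (3) Cov(Z, W) = 0 as well
```

At $p=\tfrac12$ the covariance vanishes, so $Z$ and $W$ are uncorrelated, despite being dependent.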


$W$ and $Z$ are not independent because $$ \mathbb P(Z=0,W=1) = 0 $$ whereas $$ \mathbb P(Z=0)\mathbb P(W=1) = (1-p)^2\cdot 2p(1-p) = 2p(1-p)^3>0. $$

Moreover, \begin{align} \mathbb E[ZW] &= \mathbb E[(X+Y)|X-Y|]\\ &= \sum_{i=0}^1\sum_{j=0}^1 (i+j)|i-j|\mathbb P(X=i)\mathbb P(Y=j)\\ &= \mathbb P(X=1)\mathbb P(Y=0) + \mathbb P(X=0)\mathbb P(Y=1) = 2p(1-p), \end{align} and \begin{align} \mathbb E[Z] &= \mathbb E[X+Y] = \mathbb E[X] + \mathbb E[Y] = 2\mathbb E[X] = 2p\\ \mathbb E[W] &= \mathbb E[|X-Y|] = 0\cdot(p^2+(1-p)^2) + 1\cdot(2p(1-p)) = 2p(1-p), \end{align} so \begin{align} \operatorname{Cov}(Z,W) &= \mathbb E[ZW] - \mathbb E[Z]\mathbb E[W]\\ &= 2p(1-p) - 2p\cdot2p(1-p)\\ &= 2 p (1-p)(1-2 p). \end{align}

To compute the correlation of $Z$ and $W$ we also need the variances. We have \begin{align} \mathbb E[Z^2] &= \sum_{k=0}^2 k^2\cdot\mathbb P(Z=k)\\ &= 0^2\cdot(1-p)^2 + 1^2\cdot2p(1-p) + 2^2\cdot p^2\\ &= 2p(1+p), \end{align} and so \begin{align} \operatorname{Var}(Z) &= \mathbb E[Z^2] - \mathbb E[Z]^2\\ &= 2p(1+p) - (2p)^2 = 2p(1-p). \end{align} Similarly, \begin{align} \mathbb E[W^2] &= \sum_{k=0}^1 k^2\cdot\mathbb P(W=k)\\ &= 1^2\cdot\mathbb P(W=1)\\ &= 2p(1-p), \end{align} and so \begin{align} \operatorname{Var}(W) &= \mathbb E[W^2] - \mathbb E[W]^2\\ &= 2p(1-p) -(2p(1-p))^2\\ &= 2p(1-p)\bigl(1-2p+2p^2\bigr). \end{align}

The correlation of $Z$ and $W$ is then given by \begin{align} \rho(Z,W) &= \frac{\operatorname{Cov}(Z,W)}{\sqrt{\operatorname{Var}(Z)}\sqrt{\operatorname{Var}(W)}}\\ &= \frac{2 p (1-p)(1-2 p)}{\sqrt{2p(1-p)}\,\sqrt{2p(1-p)\bigl(1-2p+2p^2\bigr)}}\\ &= \frac{1-2 p}{\sqrt{1-2p+2p^2}}. \end{align}

Note that when $p=\frac12$, $\operatorname{Cov}(Z,W)=\rho(Z,W)=0$: the variables are then uncorrelated, yet still dependent.
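As a cross-check of these closed forms (a sketch, not part of the original answer), exact enumeration over the four outcomes reproduces the covariance formula $2p(1-p)(1-2p)$ for an arbitrary $p$, here tested at $p=\tfrac13$ and $p=\tfrac12$.

```python
from itertools import product
from fractions import Fraction
import math

def moments(p):
    """Exact Cov(Z, W) and correlation, by enumerating the four (X, Y) outcomes."""
    outcomes = list(product((0, 1), repeat=2))

    def prob(x, y):
        return (p if x else 1 - p) * (p if y else 1 - p)

    e = lambda f: sum(f(x, y) * prob(x, y) for x, y in outcomes)
    ez, ew = e(lambda x, y: x + y), e(lambda x, y: abs(x - y))
    cov = e(lambda x, y: (x + y) * abs(x - y)) - ez * ew
    var_z = e(lambda x, y: (x + y) ** 2) - ez ** 2
    var_w = e(lambda x, y: abs(x - y) ** 2) - ew ** 2
    rho = cov / math.sqrt(var_z * var_w) if var_z and var_w else 0.0
    return cov, rho

for p in (Fraction(1, 3), Fraction(1, 2)):
    cov, rho = moments(p)
    assert cov == 2 * p * (1 - p) * (1 - 2 * p)  # closed form from the answer
    print(p, cov, rho)
```

At $p=\tfrac13$ this gives $\operatorname{Cov}(Z,W)=\tfrac{4}{27}$ and $\rho=\tfrac{1}{\sqrt5}$, matching $(1-2p)/\sqrt{1-2p+2p^2}$; at $p=\tfrac12$ both are $0$.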