Independence between three random variables

Let us assume we have three random variables $X_1,X_2,Y$, where $Y$ is independent of $X_1$ and $Y$ is also independent of $X_2$, while $X_1$ and $X_2$ are not independent. Then we know that it may be the case that $Y$ is not independent of $X_1+X_2$. Now, I have three questions:

1) If we said that $Y$ was independent of $(X_1,X_2)$, then $Y$ would be independent of $X_1+X_2$. Right?

2) Is there any other simple example in which $Y$ is independent of $X_1$ and $Y$ is independent of $X_2$, but $Y$ is not independent of $X_1+X_2$? Beyond the one in

https://stats.stackexchange.com/questions/222242/is-the-sum-of-two-variables-independent-of-a-third-variable-if-they-are-so-on-t

3) What would change if we added the information that the three r.v.s are jointly Gaussian? Would it then be true that if $Y$ is independent of $X_1$ and $Y$ is independent of $X_2$, then $Y$ is independent of $X_1+X_2$?

Thank you

1). Yes. This is actually argued in the first answer in the link you gave.

2). If you consider addition in, say, $\mathbb{F}_2$ (i.e., modulo 2), there is the very simple example of $X_1,X_2$ independent and uniform on $\mathbb{F}_2$, with $Y=X_1+X_2$.
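As a quick sanity check of this example (the enumeration below is my own sketch, not part of the original answer), the four outcomes of $(X_1,X_2)$ are equally likely, so exact arithmetic over them verifies the pairwise independences, while $Y$ literally equals $X_1+X_2$ and so cannot be independent of it:

```python
# Exhaustive check of the F_2 example: X1, X2 independent uniform on {0, 1},
# Y = X1 + X2 (mod 2). All four outcomes of (X1, X2) have probability 1/4.
from itertools import product
from fractions import Fraction

outcomes = [(x1, x2, (x1 + x2) % 2) for x1, x2 in product((0, 1), repeat=2)]
p = Fraction(1, len(outcomes))  # each outcome is equally likely

def marginal(i):
    """Marginal distribution of component i of an outcome."""
    d = {}
    for o in outcomes:
        d[o[i]] = d.get(o[i], 0) + p
    return d

def independent(i, j):
    """Check P(A=a, B=b) == P(A=a) * P(B=b) for all values, exactly."""
    joint = {}
    for o in outcomes:
        joint[(o[i], o[j])] = joint.get((o[i], o[j]), 0) + p
    mi, mj = marginal(i), marginal(j)
    return all(joint.get((a, b), 0) == mi[a] * mj[b] for a in mi for b in mj)

print(independent(0, 2))  # Y independent of X1: True
print(independent(1, 2))  # Y independent of X2: True
print(independent(2, 2))  # Y independent of X1+X2 (= Y itself): False
```

Checking component 2 against itself is exactly the question "is $Y$ independent of $X_1+X_2$" here, since $Y$ and $X_1+X_2$ coincide as random variables.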

3). Without loss of generality, we can assume $X_1,X_2,Y$ are all zero-mean with unit variance, since subtracting a constant from each variable changes neither the dependence structure nor Gaussianity, and neither does rescaling. Since $(X_1,X_2,Y)$ are jointly Gaussian and $X_1,Y$ (resp. $X_2,Y$) are independent, the covariance matrix is $$\Sigma = \begin{pmatrix} 1&\alpha & 0\\ \alpha & 1& 0\\ 0& 0& 1\\ \end{pmatrix}$$ where $\alpha = \mathbb{E}[X_1X_2] = \operatorname{Cov}(X_1,X_2)$.

Then $(X_1+X_2,Y)$ is jointly Gaussian (it is a linear transformation of $(X_1,X_2,Y)$): $$ \begin{pmatrix} X_1+X_2\\Y \end{pmatrix} = \underbrace{\begin{pmatrix} 1&1&0\\ 0&0&1 \end{pmatrix}}_{A}\begin{pmatrix} X_1\\X_2\\Y \end{pmatrix} $$ and its covariance matrix is $$\Sigma' = A\Sigma A^T = \begin{pmatrix} 2(1+\alpha) & 0\\ 0&1 \end{pmatrix}$$ showing that $X_1+X_2$ and $Y$ are independent (being jointly Gaussian and uncorrelated).

4). If you remove the "jointly" from "jointly Gaussian", however, this is no longer true. Consider $X_1,Y$ independent standard Gaussians $N(0,1)$, set $R = \operatorname{sign}(Y)$ and $X_2 = RX_1$. Then

  • $X_1$ and $Y$ are independent (by definition)
  • $X_2$ and $Y$ are independent: conditionally on $Y$, the variable $X_2 = RX_1$ is still $N(0,1)$, since $X_1$ is symmetric and independent of $Y$, so the conditional law of $X_2$ does not depend on $Y$
  • $X_1+X_2=(1+R)X_1$ and $Y = R|Y|$ are not independent: for instance, $Y$ is negative iff $X_1+X_2=0$ (almost surely).
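The counterexample is easy to see in simulation; below is a minimal sketch (seed and sample size are arbitrary choices). Note that when $Y<0$ the cancellation $X_1 + (-X_1) = 0$ is exact even in floating point:

```python
# Simulate the counterexample: X1, Y independent N(0,1), R = sign(Y),
# X2 = R * X1. Then X1 + X2 = 0 exactly whenever Y < 0.
import random

random.seed(0)
n = 10_000
samples = []
for _ in range(n):
    x1 = random.gauss(0.0, 1.0)
    y = random.gauss(0.0, 1.0)    # drawn independently of x1
    r = 1.0 if y >= 0 else -1.0   # R = sign(Y)
    x2 = r * x1                   # X2 = R * X1, again N(0, 1)
    samples.append((x1, x2, y))

neg = [x1 + x2 for x1, x2, y in samples if y < 0]
pos = [x1 + x2 for x1, x2, y in samples if y >= 0]
print(all(s == 0.0 for s in neg))  # True: the sum vanishes whenever Y < 0
print(any(s != 0.0 for s in pos))  # True: the sum is 2*X1 != 0 a.s. when Y >= 0
```

Observing the value of $X_1+X_2$ therefore tells you (almost surely) the sign of $Y$, which is exactly the failure of independence.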