I have a simple question in Probability but I cannot find an easy way to show it (if it is true) from the basic definitions.
If $X$ is a random variable that is independent of $Y_1$ and independent of $Y_2$, is $X$ necessarily independent of $Y_1+Y_2$?
Counterexample.
Throw a fair coin twice.
For $i=1,2$ let $Y_i$ take value $1$ if the $i$-th toss is heads and value $0$ otherwise.
Let $X$ take value $1$ if both tosses give the same result and value $0$ otherwise.
Then $X$ and $Y_1$ are independent and $X$ and $Y_2$ are independent.
However, $X=1$ means the two tosses agree, so $Y_1+Y_2$ is either $0$ or $2$; hence $$P(X=1,Y_1+Y_2=1)=0\neq \tfrac12\cdot\tfrac12= P(X=1)P(Y_1+Y_2=1).$$
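The counterexample is small enough to check exhaustively. The sketch below (variable names are my own) enumerates the four equally likely outcomes of two fair tosses, confirms that $X$ is independent of each $Y_i$, and shows the failure of independence from $Y_1+Y_2$:

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes (toss1, toss2) of two fair coin tosses.
outcomes = list(product([0, 1], repeat=2))

def prob(event):
    """Probability of an event, given as a predicate on an outcome."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

X  = lambda o: 1 if o[0] == o[1] else 0  # 1 iff both tosses agree
Y1 = lambda o: o[0]                      # 1 iff first toss is heads
Y2 = lambda o: o[1]                      # 1 iff second toss is heads

# X is independent of each Y_i separately:
for Y in (Y1, Y2):
    for x, y in product([0, 1], repeat=2):
        assert (prob(lambda o: X(o) == x and Y(o) == y)
                == prob(lambda o: X(o) == x) * prob(lambda o: Y(o) == y))

# ... but not of their sum:
lhs = prob(lambda o: X(o) == 1 and Y1(o) + Y2(o) == 1)                # 0
rhs = prob(lambda o: X(o) == 1) * prob(lambda o: Y1(o) + Y2(o) == 1)  # 1/4
assert lhs != rhs
```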
If $X$ is independent of $Y_1$ and independent of $Y_2$ it need not be independent of $Y_1+Y_2$. But if $X$ is jointly independent of $Y_1$ and $Y_2$ then it is independent of $f(Y_1,Y_2)$ for any measurable function $f:\mathbb R^{2} \to \mathbb R$. In particular it is independent of $Y_1+Y_2$.
A counterexample: as is well known, there exist events $A,B,C$ such that any two of them are independent but $P(A\cap B\cap C) \neq P(A)P(B)P(C)$. Take $X=I_A$, $Y_1=I_B$ and $Y_2=I_C$. Then $X$ is independent of $Y_1$ as well as of $Y_2$, but a simple computation shows $E[X(Y_1+Y_2)^{2}] \neq (EX)\, E[(Y_1+Y_2)^{2}]$. This implies that $X$ is not independent of $(Y_1+Y_2)^{2}$, and hence $X$ cannot be independent of $Y_1+Y_2$ either.
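To make the "simple computation" concrete, one standard choice of such events (an assumption on my part, not specified in the answer) is again two fair coin tosses with $A=\{$first toss heads$\}$, $B=\{$second toss heads$\}$, $C=\{$tosses agree$\}$. A short sketch verifying the moment inequality:

```python
from itertools import product
from fractions import Fraction

# Outcomes (toss1, toss2) of two fair coin tosses, each with probability 1/4.
outcomes = list(product([0, 1], repeat=2))

def E(f):
    """Expectation of f under the uniform distribution on outcomes."""
    return Fraction(sum(f(o) for o in outcomes), len(outcomes))

X  = lambda o: o[0]                      # I_A: first toss heads
Y1 = lambda o: o[1]                      # I_B: second toss heads
Y2 = lambda o: 1 if o[0] == o[1] else 0  # I_C: tosses agree

# The indicators are pairwise independent:
assert E(lambda o: X(o) * Y1(o)) == E(X) * E(Y1)
assert E(lambda o: X(o) * Y2(o)) == E(X) * E(Y2)

# ... yet the moment identity forced by independence fails:
lhs = E(lambda o: X(o) * (Y1(o) + Y2(o)) ** 2)   # E[X (Y1+Y2)^2] = 1
rhs = E(X) * E(lambda o: (Y1(o) + Y2(o)) ** 2)   # (1/2)(3/2) = 3/4
assert lhs != rhs
```

Since independent random variables would satisfy $E[X g(Y_1+Y_2)] = E[X]\,E[g(Y_1+Y_2)]$ for bounded measurable $g$, the failing identity rules out independence.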