$X, Y, Z$ are random variables, and we know $(Y+Z)$ is also a random variable. If $X$ is independent of $Y$ and $X$ is independent of $Z$, is it guaranteed that $X$ is independent of $(Y+Z)$?
Sorry, my probability course didn't cover much measure theory. My idea was to get the p.d.f. of $(Y+Z)$ as a marginal of the joint p.d.f. $f_{Y+Z,Y}(u,v)$, but it doesn't work.
Answers using measure theory are also fine; I will try to understand them.
If $X$ is jointly independent of $(Y,Z)$, then it is independent of $Y+Z$. But if $X$ is merely independent of $Y$ and independent of $Z$, it need not be independent of $Y+Z$.

Example: it is well known that there exist events $A,B,C$ which are pairwise independent but with $P(A\cap B \cap C) \neq P(A)P(B)P(C)$. Let $X= \log (1+I_A)$, $Y= \log (1+I_B)$ and $Z= \log (1+I_C)$. Then $X$ is independent of $Y$ and $X$ is independent of $Z$. Suppose $X$ were independent of $Y+Z$. Then $e^{X}= 1+I_A$ would be independent of $e^{Y+Z}=(1+I_B)(1+I_C)$, which implies $$E[(1+I_A)(1+I_B)(1+I_C)]=E(1+I_A)\, E[(1+I_B)(1+I_C)].$$ Expanding both sides and using pairwise independence (so that $P(A\cap B)=P(A)P(B)$, $P(A\cap C)=P(A)P(C)$ and $P(B\cap C)=P(B)P(C)$), this reduces to $P(A\cap B \cap C) = P(A)P(B)P(C)$, a contradiction.

For a specific example, take $(0,1)$ with Lebesgue measure and $A=(0,\frac 1 2)$, $B=(\frac 1 4 ,\frac 3 4)$, $C=(0,\frac 1 4)\cup (\frac 1 2 ,\frac 3 4)$. Then $P(A)=P(B)=P(C)=\frac 1 2$ and $P(A\cap B)=P(A\cap C)=P(B\cap C)=\frac 1 4$, so the events are pairwise independent, yet $A\cap B\cap C=\emptyset$, so $P(A\cap B\cap C)=0\neq \frac 1 8$.
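As a sanity check, here is a small Python sketch (names and setup are mine, not from the answer) that verifies such a counterexample exactly. It uses the pairwise-independent events $A=(0,\frac12)$, $B=(\frac14,\frac34)$, $C=(0,\frac14)\cup(\frac12,\frac34)$ on $(0,1)$ with Lebesgue measure; since each event is a union of quarter-intervals, every probability is just a count of quarters over 4.

```python
from fractions import Fraction

# Model (0,1) by its four quarter-intervals [0,1/4), [1/4,1/2), [1/2,3/4), [3/4,1),
# indexed 0..3.  Each event below is a union of quarters, so P(event) = |event|/4.
A = {0, 1}      # (0, 1/2)
B = {1, 2}      # (1/4, 3/4)
C = {0, 2}      # (0, 1/4) ∪ (1/2, 3/4)

def P(event):
    return Fraction(len(event), 4)

# Pairwise independence holds...
assert P(A & B) == P(A) * P(B)
assert P(A & C) == P(A) * P(C)
assert P(B & C) == P(B) * P(C)
# ...but joint independence fails: A ∩ B ∩ C is empty, not measure 1/8.
assert P(A & B & C) != P(A) * P(B) * P(C)

# Now X = log(1+I_A) and Y+Z = log((1+I_B)(1+I_C)).  The event
# {X = log 2, Y+Z = 2 log 2} is exactly A ∩ B ∩ C, while independence
# would force its probability to equal P(A) * P(B ∩ C).
lhs = P(A & B & C)     # joint probability of {X = log 2} ∩ {Y+Z = 2 log 2}
rhs = P(A) * P(B & C)  # product of the marginal probabilities
print(lhs, rhs)        # prints: 0 1/8
```

The mismatch (0 versus 1/8) exhibits the dependence directly: knowing $Y+Z = 2\log 2$ tells you the sample point is in $B\cap C$, which rules out $A$ entirely.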