Suppose there are three random variables $X$, $Y$, and $Z$ such that $X$ is independent of $Y$ and $X$ is independent of $Z$. This means that
$P(X \mid Z) = P(X \mid Y) = P(X)$.
Is it then the case that $X$ is independent of the pair $(Y, Z)$ jointly, so that $P(X \mid Y, Z) = P(X)$?
My intuition says that this should be true but I am not sure how to go about proving it to myself.
Thanks!
It is actually (perhaps surprisingly) false. To see it, let $Y$ and $Z$ be i.i.d. Bernoulli(1/2), and let $X = Y \text{ XOR } Z$. In plainer English: $Y$ is 1 with probability 1/2 and 0 with probability 1/2, and the same for $Z$; if $Y$ and $Z$ are equal, then $X$ is 0, and if they differ, then $X$ is 1.
Now, you can check from the definition that $X$ is independent of $Y$ and independent of $Z$ (just go through it case by case), but $X$ is certainly not independent of the pair $(Y, Z)$: once both $Y$ and $Z$ are known, $X$ is completely determined!
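If you want to see the case-by-case check done explicitly, here is a small sketch that enumerates the four equally likely outcomes of $(Y, Z)$ and compares the relevant conditional probabilities (the helper names `prob` and `cond_prob` are just for illustration):

```python
from itertools import product

# The four equally likely outcomes (x, y, z), with x = y XOR z.
outcomes = [(y ^ z, y, z) for y, z in product([0, 1], repeat=2)]

def prob(event):
    # Unconditional probability under the uniform distribution on outcomes.
    return sum(1 for o in outcomes if event(o)) / len(outcomes)

def cond_prob(event, given):
    # Conditional probability: restrict to outcomes where `given` holds.
    restricted = [o for o in outcomes if given(o)]
    return sum(1 for o in restricted if event(o)) / len(restricted)

# Pairwise independence: conditioning on Y alone (or Z alone) changes nothing.
print(prob(lambda o: o[0] == 1))                            # P(X=1)       -> 0.5
print(cond_prob(lambda o: o[0] == 1, lambda o: o[1] == 1))  # P(X=1 | Y=1) -> 0.5
print(cond_prob(lambda o: o[0] == 1, lambda o: o[2] == 1))  # P(X=1 | Z=1) -> 0.5

# But conditioning on Y and Z together pins X down completely.
print(cond_prob(lambda o: o[0] == 1,
                lambda o: o[1] == 1 and o[2] == 1))         # P(X=1 | Y=1, Z=1) -> 0.0
```

So $P(X = 1 \mid Y) = P(X = 1 \mid Z) = P(X = 1) = 1/2$, yet $P(X = 1 \mid Y = 1, Z = 1) = 0 \ne 1/2$, which is exactly the failure of joint independence.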