Does independence of each of $Y$ and $Z$ from $X$ imply independence of $f(Y,Z)$ from $X$?


I'm trying to figure out a proof for the following statement.

If $X$ and $Y$ are independent, and $X$ and $Z$ are independent,

then $X$ and $f(Y,Z)$ are also independent, for any function $f(\cdot, \cdot)$.

Is there a counterexample to the above statement?


BEST ANSWER

Let $X$, $Y$ and $Z$ be three binary random variables such that all valuations in which an even number of them are $1$ are equiprobable (and the other valuations don't occur). Then $X$ and $Y$ are independent, and $X$ and $Z$ are independent, but $X=Y\operatorname{XOR} Z$.
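This construction can be verified by direct enumeration. Here is a sketch using exact arithmetic (the names `support` and `prob` are my own, not from the answer): the support consists of the four valuations with an even number of $1$s, each with probability $1/4$.

```python
from itertools import product
from fractions import Fraction

# The four valuations (x, y, z) with an even number of 1s are equiprobable;
# all other valuations have probability 0.
support = [(x, y, z) for x, y, z in product((0, 1), repeat=3)
           if (x + y + z) % 2 == 0]

def prob(event):
    # Probability of an event under the uniform distribution on `support`.
    return Fraction(sum(1 for out in support if event(out)), len(support))

# X and Y are independent, and X and Z are independent:
# P(X=a, Y=b) = P(X=a) * P(Y=b) for every pair (a, b), likewise for Z.
for a, b in product((0, 1), repeat=2):
    assert prob(lambda o: o[0] == a and o[1] == b) == \
           prob(lambda o: o[0] == a) * prob(lambda o: o[1] == b)
    assert prob(lambda o: o[0] == a and o[2] == b) == \
           prob(lambda o: o[0] == a) * prob(lambda o: o[2] == b)

# ...yet X is a deterministic function of (Y, Z): X = Y XOR Z on every outcome.
assert all(x == (y ^ z) for x, y, z in support)
```

Since $X$ is determined by $(Y, Z)$, taking $f$ to be XOR gives $f(Y, Z) = X$, which is as far from independent of $X$ as possible.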

SECOND ANSWER

I like this question. Your statement sounds very credible and is nonetheless wrong. One cannot see enough examples of statements like that!

I like joriki's counterexample. Here is another one. I throw a die; the outcome is $A$. We take $X$ to be the event $A \in \{1, 2, 3\}$, $Y$ to be the event $A \in \{1, 5\}$, and $Z$ to be the event $A \in \{1, 6\}$.

Now the probability that $Y$ holds given $X$ is $1/3$, which is also the probability that $Y$ holds without any knowledge of whether or not $X$ is true. So $X$ and $Y$ are independent. (You can also see it from the other side: the probability that $X$ holds is a priori $1/2$. After knowing $Y$ it is still $1/2$. So $X$ and $Y$ are independent.)

The reasoning that $X$ and $Z$ are independent is identical.

Now let $f(Y, Z)$ be $Y$ AND $Z$. Tracing back the definitions, we find that $f(Y, Z)$ is the event $A = 1$. Intuitively this is clearly not independent of $X$, and the intuition is backed up by computation: the probability of $X$ is $1/2$ without any knowledge of $f(Y, Z)$, but becomes $1$ once we know that $f(Y, Z)$ is true.
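The die computation above can also be checked by enumerating the six outcomes (a sketch; the helper `prob` and the event names are mine):

```python
from fractions import Fraction

# Sample space: a fair die, outcomes 1..6, each with probability 1/6.
die = range(1, 7)

def prob(event):
    # Probability of an event (a predicate on the outcome) under the fair die.
    return Fraction(sum(1 for a in die if event(a)), 6)

X = lambda a: a in {1, 2, 3}   # P(X) = 1/2
Y = lambda a: a in {1, 5}      # P(Y) = 1/3
Z = lambda a: a in {1, 6}      # P(Z) = 1/3

# X is independent of Y, and X is independent of Z:
assert prob(lambda a: X(a) and Y(a)) == prob(X) * prob(Y)
assert prob(lambda a: X(a) and Z(a)) == prob(X) * prob(Z)

# f(Y, Z) = Y AND Z is the event {A = 1}, with probability 1/6...
f = lambda a: Y(a) and Z(a)
assert prob(f) == Fraction(1, 6)

# ...and it is NOT independent of X: P(X and f) = 1/6, not 1/2 * 1/6.
assert prob(lambda a: X(a) and f(a)) != prob(X) * prob(f)
```

Note that $P(X \mid f(Y,Z)) = (1/6)/(1/6) = 1$, matching the computation in the text.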