Independence and conditional independence equivalence in probability theory


I was recently given this in my probability theory class on the different meanings of independence:

Let $X$, $Y$, and $Z$ be three random variables. We are asked to prove or disprove the following:

  1. If X and Y are independent, then is it true that X and Y are independent given Z?

  2. If X and Y are independent given Z, is it true that X and Y are independent?

I do not yet have much intuition in this area, so I do not know which statement is true and which is false, or even where to start. I am not sure how to attack this problem or which theorems to apply, so explicit proofs or counterexamples would be much appreciated, as I am a novice in probability theory. Thanks to all helpers.


Best answer:
  1. You need to say something about $Z$. If, for example, $Z=X+Y$, then clearly $X$ and $Y$ are not independent given $Z$.
  2. Let $Z$, $V$, and $W$ be independent random variables, and set $X=Z+V$ and $Y=Z+W$. Can you show that $X$ and $Y$ are independent given $Z$, but that $X$ and $Y$ are not independent?
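The hint in point 2 can be checked by exact enumeration. Here is a minimal sketch, assuming for concreteness that $Z$, $V$, $W$ are i.i.d. fair coin flips (the hint leaves their distributions unspecified):

```python
from itertools import product
from fractions import Fraction

# Z, V, W i.i.d. fair coin flips (a concrete choice; the hint only
# requires Z, V, W independent). Each outcome (z, v, w) has prob 1/8.
outcomes = list(product([0, 1], repeat=3))
p = Fraction(1, 8)

def prob(event):
    """Total probability of outcomes satisfying event(z, v, w)."""
    return sum(p for (z, v, w) in outcomes if event(z, v, w))

# X = Z + V, Y = Z + W.
# Unconditional independence fails: compare P(X=0, Y=0) with P(X=0)P(Y=0).
pxy = prob(lambda z, v, w: z + v == 0 and z + w == 0)  # 1/8
px  = prob(lambda z, v, w: z + v == 0)                 # 1/4
py  = prob(lambda z, v, w: z + w == 0)                 # 1/4
print(pxy, px * py)  # 1/8 vs 1/16 -> not independent

# Conditional independence given Z = 0 holds (given Z, X = Z+V and
# Y = Z+W depend on the independent variables V and W separately).
pz    = prob(lambda z, v, w: z == 0)                        # 1/2
pxy_z = prob(lambda z, v, w: z == 0 and v == 0 and w == 0) / pz  # 1/4
px_z  = prob(lambda z, v, w: z == 0 and v == 0) / pz             # 1/2
py_z  = prob(lambda z, v, w: z == 0 and w == 0) / pz             # 1/2
print(pxy_z, px_z * py_z)  # 1/4 = 1/4 -> conditionally independent
```

Using `Fraction` keeps all probabilities exact, so the failure of independence is an equality check rather than a numerical approximation.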
Second answer:

Toss a fair coin twice. Let $X=1$ if the first toss is a head, and $0$ otherwise. Define $Y$ similarly for the second toss. Let $Z=1$ if the two tosses give the same result, and $0$ otherwise. Then $X$ and $Y$ are independent, but given that $Z=1$ they are not.

Note that in this example $X$ and $Z$ are also independent, as are $Y$ and $Z$; the three variables are pairwise independent but not mutually independent.
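This counterexample can be verified by enumerating the four equally likely outcomes of the two tosses. A quick sketch (the variable names are mine):

```python
from fractions import Fraction

# Two fair tosses: (first, second), with 1 = head, 0 = tail.
# Each of the four outcomes has probability 1/4.
outcomes = [(a, b) for a in (0, 1) for b in (0, 1)]
p = Fraction(1, 4)

def prob(event):
    """Total probability of outcomes satisfying event(x, y)."""
    return sum(p for (x, y) in outcomes if event(x, y))

# X = first toss, Y = second toss, Z = 1 iff the tosses agree.
# Unconditionally, X and Y are independent:
assert (prob(lambda x, y: x == 1 and y == 1)
        == prob(lambda x, y: x == 1) * prob(lambda x, y: y == 1))

# Given Z = 1 they are not: conditioning on agreement forces X = Y.
pz    = prob(lambda x, y: x == y)                     # P(Z=1) = 1/2
pxy_z = prob(lambda x, y: x == 1 and y == 1) / pz     # P(X=1,Y=1 | Z=1) = 1/2
px_z  = prob(lambda x, y: x == 1 and x == y) / pz     # P(X=1 | Z=1) = 1/2
py_z  = prob(lambda x, y: y == 1 and x == y) / pz     # P(Y=1 | Z=1) = 1/2
assert pxy_z != px_z * py_z                           # 1/2 != 1/4

# And, as noted, X and Z are independent (likewise Y and Z):
assert (prob(lambda x, y: x == 1 and x == y)
        == prob(lambda x, y: x == 1) * pz)
```

All three assertions pass, confirming both the counterexample and the pairwise-independence remark.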