$X \perp Y \mid Z$ and $X \perp Z \mid Y \Rightarrow X \perp (Y,Z)$?


I'm attempting to prove this theorem from "A Concise Course in Statistical Inference" by Larry Wasserman (Theorem 17.2):

If $X \perp Y \mid Z$ and $X \perp Z \mid Y$, then $X \perp (Y,Z)$.

He sort of just says "this is true," and I'm left pretty confused as to how.

From just basic conditional independence knowledge, I know that:

$X \perp Y \mid Z$ means $P(X,Y \mid Z)=P(X \mid Z)P(Y \mid Z)$

$X \perp Z \mid Y$ means $P(X,Z \mid Y)=P(X \mid Y)P(Z \mid Y)$

$X \perp (Y,Z)$ means $P(X,(Y,Z))=P(X)P(Y,Z)$

I'm honestly not entirely sure how to interpret $X \perp (Y,Z)$. Is it just the independence of all three variables? I don't think so, because it doesn't seem to say that $Y$ and $Z$ are independent of each other.

I'm pretty confused about how to approach this problem, since two separate hypotheses have to combine to imply a third statement, and I don't see any obvious way to set things equal to each other and arrive at the conclusion. I've never been great at these probability proofs with Bayes' rule, so any help at all would be greatly appreciated :)

Best Answer

I haven't read that book on statistical inference, so maybe the "inverted $\Pi$" / "$\perp\perp$" symbol doesn't mean independence there. Because if it does mean independence, then the claim is false.

Counter-example 1: $X=Y=Z$. Then, conditioned on $Z=c$, both $X=Y=c$, and constant random variables are independent of everything else. To be really explicit, let $Z \sim \mathrm{Bernoulli}(1/2) \in \{0,1\}$ and let's prove that $X,Y$ are independent when conditioned on $Z=1$:

  • $P(X=0|Z=1)P(Y=0|Z=1) = 0\times 0 = 0 = P(X=0,Y=0|Z=1)$
  • $P(X=0|Z=1)P(Y=1|Z=1) = 0\times 1 = 0 = P(X=0,Y=1|Z=1)$
  • $P(X=1|Z=1)P(Y=0|Z=1) = 1\times 0 = 0 = P(X=1,Y=0|Z=1)$
  • $P(X=1|Z=1)P(Y=1|Z=1) = 1\times 1 = 1 = P(X=1,Y=1|Z=1)$

Repeat this for $Z=0$ and this proves $X \perp Y | Z$. Similarly $X \perp Z | Y$, so both preconditions are satisfied. However, obviously $X$ is not independent of $(Y,Z)$:

  • $P(X=1)P(Y=Z=1) = {1\over 2}{1\over 2} = {1\over 4} \neq P(X=1 \cap Y=Z=1) = {1\over 2}$
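A quick sanity check of this counter-example, as a Python sketch (the joint pmf and the helper `p` are my own construction, not from the book): enumerate the distribution of $X=Y=Z \sim \mathrm{Bernoulli}(1/2)$, verify both conditional independences, and watch the joint one fail.

```python
from itertools import product

# Counter-example 1 as an explicit joint pmf: X = Y = Z ~ Bernoulli(1/2),
# so only the outcomes (0,0,0) and (1,1,1) carry probability.
pmf = {xyz: 0.0 for xyz in product([0, 1], repeat=3)}
pmf[(0, 0, 0)] = 0.5
pmf[(1, 1, 1)] = 0.5

def p(**fixed):
    """Probability of the event fixing the given coordinates, e.g. p(x=1, z=0)."""
    idx = {"x": 0, "y": 1, "z": 2}
    return sum(q for xyz, q in pmf.items()
               if all(xyz[idx[k]] == v for k, v in fixed.items()))

# X ⊥ Y | Z and X ⊥ Z | Y both hold: every conditional factorizes.
for a, b, c in product([0, 1], repeat=3):
    assert abs(p(x=a, y=b, z=c) / p(z=c)
               - (p(x=a, z=c) / p(z=c)) * (p(y=b, z=c) / p(z=c))) < 1e-12
    assert abs(p(x=a, y=b, z=c) / p(y=b)
               - (p(x=a, y=b) / p(y=b)) * (p(z=c, y=b) / p(y=b))) < 1e-12

# ...but X ⊥ (Y, Z) fails:
print(p(x=1) * p(y=1, z=1))   # prints 0.25
print(p(x=1, y=1, z=1))       # prints 0.5
```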

Counter-example 2: If you think $X=Y=Z$ is too degenerate, then simply add some independent "fuzz" around them. E.g. $W \sim \mathrm{Bernoulli}(1/2) \in \{0,1\}$ and $X=W+A$, $Y=W+B$, $Z=W+C$ where $A,B,C$ are i.i.d. $\mathrm{Uniform}(0,\tfrac{1}{10})$.

Once again, conditioned on $Z=z$, you can tell what $W$ is because the ranges $(0,0.1)$ and $(1,1.1)$ don't overlap. And once you know $W=w$, $X$ and $Y$ become independent. So $X \perp Y | Z$ again. And as before it is clear that $X$ and $(Y,Z)$ are dependent (through $W$).
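This construction can also be checked by simulation; here is a Monte Carlo sketch (the simulation code and the choice of covariance as the dependence probe are mine). Marginally $\mathrm{Cov}(X,Y)=\mathrm{Var}(W)=1/4$, while within a $Z$-stratum (which pins down $W$) the covariance vanishes.

```python
import random

# Counter-example 2 by simulation: W ~ Bernoulli(1/2), A, B, C i.i.d.
# Uniform(0, 0.1), and X = W + A, Y = W + B, Z = W + C.
random.seed(0)
n = 200_000
xs, ys, zs = [], [], []
for _ in range(n):
    w = random.randint(0, 1)
    xs.append(w + random.uniform(0, 0.1))
    ys.append(w + random.uniform(0, 0.1))
    zs.append(w + random.uniform(0, 0.1))

def cov(u, v):
    """Sample covariance of two equal-length sequences."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

# Marginally, X and Y are dependent through W: Cov(X, Y) = Var(W) = 1/4.
print(round(cov(xs, ys), 2))  # close to 0.25, far from 0

# Conditioning on Z pins down W (the strata z < 0.5 and z > 0.5 never mix),
# and within a stratum X and Y are just independent uniform fuzz:
stratum = [(x, y) for x, y, z in zip(xs, ys, zs) if z < 0.5]
print(round(cov(*zip(*stratum)), 4))  # close to 0
```

(Zero covariance alone doesn't prove independence, of course, but here it illustrates exactly the dependence-through-$W$ described above.)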

Probable conclusion: either the symbol in the textbook doesn't mean independence, or there is some additional precondition (such as a strictly positive joint density) not shown on this scanned image.
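For what it's worth, the standard extra precondition that rescues the claim is positivity. The following is the usual "intersection property" argument (not necessarily the book's own proof), assuming a strictly positive joint density $f(x,y,z) > 0$:

```latex
\text{The two hypotheses say }
f(x \mid y,z) = f(x \mid z) \quad (X \perp Y \mid Z), \qquad
f(x \mid y,z) = f(x \mid y) \quad (X \perp Z \mid Y).
```

Hence $f(x \mid y) = f(x \mid z)$ for all $y, z$, so $f(x \mid y)$ is constant in $y$; integrating against $f(y)$ gives $f(x \mid y) = \int f(x \mid y)\,f(y)\,dy = f(x)$. Then $f(x \mid y,z) = f(x)$, i.e. $X \perp (Y,Z)$. Positivity is what lets us divide by marginals and compare the conditionals at every point; both counter-examples above violate it (their supports are degenerate strata).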