Conditional Independence, Decomposition

Is there some set of independence relations among three random variables $X$, $Y$, and $Z$ such that $P(Z \mid X, Y) = P(Z \mid X) \cdot P(Z \mid Y)$? (I feel like there should be, but I can't find it.)

The relationship you are hoping to see: $$ p(z\mid x,y)=p(z\mid x) p(z\mid y)\tag1$$ isn't true for the graph $x\to z\leftarrow y$ in general. The graph $x\to z\leftarrow y$ represents a joint distribution for $(X,Y,Z)$ that can be factored into the form $$ p(x,y,z)=p(x)p(y)p(z\mid x,y).\tag2 $$ By summing (2) over $z$, we find that $X$ and $Y$ are independent. However, independence between $X$ and $Y$ doesn't imply (1).
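As a sanity check on the marginalization step, here is a minimal sketch in Python (all numeric values are arbitrary choices, not from the question): pick any marginals $p(x)$, $p(y)$ and any conditional $p(z\mid x,y)$, build the joint via (2), and confirm that summing out $z$ leaves $p(x,y)=p(x)\,p(y)$.

```python
from fractions import Fraction
from itertools import product

# Arbitrary marginals p(x), p(y) and an arbitrary conditional p(z|x,y);
# the specific numbers are made up, only the factorization (2) matters.
px = {0: Fraction(1, 3), 1: Fraction(2, 3)}
py = {0: Fraction(1, 4), 1: Fraction(3, 4)}
pz_xy = {(x, y): ({0: Fraction(1, 5), 1: Fraction(4, 5)} if x == y
                  else {0: Fraction(2, 7), 1: Fraction(5, 7)})
         for x, y in product([0, 1], repeat=2)}

# Build the joint via (2): p(x, y, z) = p(x) p(y) p(z | x, y).
joint = {(x, y, z): px[x] * py[y] * pz_xy[(x, y)][z]
         for x, y, z in product([0, 1], repeat=3)}

# Summing over z uses sum_z p(z|x,y) = 1, so p(x, y) = p(x) p(y).
for x, y in product([0, 1], repeat=2):
    pxy = sum(joint[(x, y, z)] for z in [0, 1])
    assert pxy == px[x] * py[y]
```

This works because $\sum_z p(z\mid x,y)=1$ for every $(x,y)$, so the conditional factor disappears when $z$ is marginalized out.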

For a counterexample, let $X$ and $Y$ be independent, each taking values $0$ and $1$ with equal probability, and define $Z:=X+Y$. Then the joint distribution of $(X,Y,Z)$ satisfies (2). Plugging in $x=0$, $y=1$, $z=1$, we calculate $p(z\mid x,y)=1$, $p(z\mid x)=\frac12$, and $p(z\mid y)=\frac12$, so the right-hand side of (1) equals $\frac14$ while the left-hand side equals $1$, and (1) fails.
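For completeness, the counterexample can be checked by brute-force enumeration in Python (a sketch; the `cond_prob` helper is my own, not part of the question):

```python
from fractions import Fraction
from itertools import product

# Joint table for the counterexample: X, Y independent fair coins, Z := X + Y.
joint = {}
for x, y in product([0, 1], repeat=2):
    joint[(x, y, x + y)] = Fraction(1, 4)

def cond_prob(event, given):
    """P(event | given), with event/given predicates on (x, y, z), by enumeration."""
    den = sum(pr for xyz, pr in joint.items() if given(*xyz))
    num = sum(pr for xyz, pr in joint.items() if given(*xyz) and event(*xyz))
    return num / den

# Evaluate both sides of (1) at x=0, y=1, z=1.
lhs = cond_prob(lambda x, y, z: z == 1, lambda x, y, z: x == 0 and y == 1)
rhs = (cond_prob(lambda x, y, z: z == 1, lambda x, y, z: x == 0)
       * cond_prob(lambda x, y, z: z == 1, lambda x, y, z: y == 1))
print(lhs, rhs)  # 1 vs 1/4: (1) fails
```

Using exact `Fraction` arithmetic avoids any floating-point ambiguity in the comparison.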