Conditional Probability Identity


Let $X, Y, Z$ be random variables. Is it true that

$\sum_z P(X=x| Y=y , Z=z)P(Z=z) = P(X=x|Y=y)$ if and only if $Y$ and $Z$ are independent?

It is easy to show by direct computation that if $Y$ and $Z$ are independent, the identity holds. I was wondering whether the converse direction holds as well.

Accepted answer:

Generally, the Law of Total Probability says that, for any $x$ and $y$: $$\begin{split}\mathsf P(X=x\mid Y=y) &=\sum_z \mathsf P(X=x, Z=z\mid Y=y) \\ &=\sum_z \mathsf P(X=x\mid Y=y,Z=z)\mathsf P(Z=z\mid Y=y) \end{split}$$

If $Y$ and $Z$ are independent, then $\forall y\forall z.\mathsf P(Z=z\mid Y=y)=\mathsf P(Z=z)$, by the very definition of independence. We then have, for any $x$ and $y$:

$$\begin{split} \mathsf P(X=x\mid Y=y) & =\sum_z \mathsf P(X=x\mid Y=y,Z=z)\mathsf P(Z=z) \end{split}$$
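As a sanity check, the forward direction can be verified numerically. The distributions below are hypothetical, chosen only for illustration: $Y$ and $Z$ are independent by construction, and $X$ depends on both.

```python
import itertools

# Hypothetical discrete distributions, chosen only for illustration.
pY = {0: 0.3, 1: 0.7}
pZ = {0: 0.6, 1: 0.4}                  # Y and Z independent by construction
pX1_given_YZ = {                       # P(X=1 | Y=y, Z=z)
    (0, 0): 0.1, (0, 1): 0.8,
    (1, 0): 0.5, (1, 1): 0.2,
}

def p_x_given_yz(x, y, z):
    p1 = pX1_given_YZ[(y, z)]
    return p1 if x == 1 else 1.0 - p1

def p_joint(x, y, z):
    # P(X=x, Y=y, Z=z) = P(X=x | Y=y, Z=z) P(Y=y) P(Z=z)
    return p_x_given_yz(x, y, z) * pY[y] * pZ[z]

def p_x_given_y(x, y):
    # P(X=x | Y=y), computed directly from the joint distribution
    num = sum(p_joint(x, y, z) for z in pZ)
    den = sum(p_joint(xx, y, z) for xx in (0, 1) for z in pZ)
    return num / den

def lhs(x, y):
    # sum_z P(X=x | Y=y, Z=z) P(Z=z)
    return sum(p_x_given_yz(x, y, z) * pZ[z] for z in pZ)

max_err = max(abs(lhs(x, y) - p_x_given_y(x, y))
              for x, y in itertools.product((0, 1), repeat=2))
print(max_err)  # the two sides agree up to floating-point rounding
```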


The converse, however, does not hold in general: two sums can evaluate to the same number even though their individual terms differ ($2+3=1+4$, after all). For this to happen simultaneously at every $x$ and $y$ is less common, but it is still possible, so the identity holding does not imply that $Y$ and $Z$ are independent.
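One concrete way the converse fails: if $X$ is independent of the pair $(Y, Z)$, then both sides of the identity equal $\mathsf P(X=x)$ no matter how $Y$ and $Z$ are related. The numbers below are a hypothetical sketch of this situation, with $Y$ and $Z$ deliberately dependent.

```python
# Hypothetical counterexample: X independent of (Y, Z), while Y and Z
# are dependent. The identity then holds for every (x, y) anyway.
pX = {0: 0.4, 1: 0.6}
pYZ = {(0, 0): 0.4, (0, 1): 0.1,       # joint pmf of (Y, Z); all cells
       (1, 0): 0.1, (1, 1): 0.4}       # positive, but NOT a product

def p_z(z):
    return sum(p for (y, zz), p in pYZ.items() if zz == z)

def p_y(y):
    return sum(p for (yy, z), p in pYZ.items() if yy == y)

# Because X is independent of (Y, Z):
#   P(X=x | Y=y, Z=z) = P(X=x | Y=y) = P(X=x)
def lhs(x, y):
    return sum(pX[x] * p_z(z) for z in (0, 1))

max_err = max(abs(lhs(x, y) - pX[x]) for x in (0, 1) for y in (0, 1))
dependent = abs(pYZ[(0, 0)] - p_y(0) * p_z(0)) > 1e-9
print(max_err, dependent)  # identity holds, yet Y and Z are dependent
```

Here $\mathsf P(Y=0, Z=0) = 0.4 \neq 0.25 = \mathsf P(Y=0)\mathsf P(Z=0)$, so $Y$ and $Z$ are dependent, yet the identity is satisfied for all $x$ and $y$.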