Posterior probability of conditionally independent variables


I sort of understand this logically, but I'm having a hard time getting the math to line up.

Say we have two variables that are conditionally independent given a third variable

$$P(A,B\mid C) = P(A\mid C)\cdot P(B\mid C)$$

Then the paper says that the posterior can be factored as

$$P(C\mid A,B) = \frac{P(B\mid C) P(C\mid A)}{\sum_C P(B\mid C) P(C\mid A)} $$

(the denominator is just the numerator summed over all values of $C$, i.e. a normalization constant)

Normally, using Bayes' rule the posterior would be

\begin{align} P(C\mid A,B) & = \frac{P(A,B\mid C) P(C)}{P(A,B)} \\[10pt] & = \frac{P(A\mid C)P(B\mid C)P(C)}{P(A,B)} \end{align}

I'm having trouble getting from this version to the factored version above.

I understand intuitively why this works. It's kind of like using the posterior as the new prior.


Best answer:

$1.$ $P(A,C) = P(A\mid C)\,P(C) = P(C\mid A)\,P(A)$

$2.$ \begin{align} P(A,B) &= \sum_C P(C)\, P(A,B\mid C) \\ &\overset{(a)}{=} \sum_C P(C)\, P(A\mid C)\, P(B\mid C) \\ &= P(A) \sum_C P(C\mid A)\, P(B\mid C) \end{align}

$1.$ is Bayes' law, $2.$ is the law of total probability, the equality $(a)$ comes from your assumption that $A$ and $B$ are conditionally independent given $C$, and the last equality follows from $1.$

Thus, \begin{align} P(C\mid A,B) &= \frac{P(A,B,C)}{P(A,B)}\\ &\overset{2.}{=} \frac{P(B\mid C)\, P(A\mid C)\, P(C)}{P(A)\, \sum_C P(C\mid A)\, P(B\mid C)}\\ &\overset{1.}{=} \frac{P(B\mid C)\, P(C\mid A)\, P(A)}{P(A)\, \sum_C P(C\mid A)\, P(B\mid C)} \\ &= \frac{P(B\mid C)\, P(C\mid A)}{\sum_C P(C\mid A)\, P(B\mid C)} \end{align}
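As a sanity check, the identity can also be verified numerically. The sketch below builds a small made-up discrete model (the dimensions, random seed, and observed values $a, b$ are all arbitrary choices for illustration) in which $A$ and $B$ are conditionally independent given $C$, then compares the direct Bayes posterior with the factored form:

```python
import numpy as np

# Hypothetical toy model: C has 3 states, A and B have 2 each.
# A and B are conditionally independent given C by construction.
rng = np.random.default_rng(0)
nC, nA, nB = 3, 2, 2
pC = rng.dirichlet(np.ones(nC))               # P(C)
pA_given_C = rng.dirichlet(np.ones(nA), nC)   # row c: P(A | C=c)
pB_given_C = rng.dirichlet(np.ones(nB), nC)   # row c: P(B | C=c)

a, b = 0, 1  # observed values of A and B (arbitrary)

# Direct Bayes: P(C | A=a, B=b) ∝ P(A=a|C) P(B=b|C) P(C)
joint = pA_given_C[:, a] * pB_given_C[:, b] * pC
posterior_bayes = joint / joint.sum()

# Factored form: P(C | A=a, B=b) ∝ P(B=b|C) P(C | A=a)
pA = pA_given_C[:, a] @ pC                    # P(A=a), total probability
pC_given_A = pA_given_C[:, a] * pC / pA       # P(C | A=a), Bayes' rule
num = pB_given_C[:, b] * pC_given_A
posterior_factored = num / num.sum()

print(np.allclose(posterior_bayes, posterior_factored))  # True
```

The normalization in each form cancels the $P(A)$ factor, which is why the two posteriors agree exactly, not just up to a constant.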