Marginalizing over a joint distribution


I am trying to understand and answer the following:

For the probability distribution $P(A, B, C, D)$ with the factorization $P(A, B, C, D) = P(A)P(B)P(C|A, B)P(D|C)$, show that the following conditional independence assumptions hold:

(i) $A ⊥ B|∅$

(ii) $A ⊥ D|C$

For item (i), we want to show that $P(A,B)=P(A)P(B)$. As far as I can see, I can do the following:

$P(A,B)=\sum\limits_{\forall C, D} P(A)P(B)P(C│A,B)P(D│C) = P(A)P(B)\sum\limits_{\forall C, D}P(C│A,B)P(D│C)$

But I am not sure if I can just drop the terms $P(C│A,B)$ and $P(D│C)$. I thought they could be dropped since they are distributions over the values of $C$ and $D$, respectively, and therefore each sums to 1.

I am not confident about this answer. Am I missing something, or is this the right way to go?

Best answer:

Indeed, by rearranging the double sum and using the Law of Total Probability: $$\begin{align}\sum_{\forall C,D}\mathsf P(C\mid A,B)\,\mathsf P(D\mid C)&=\sum_{\forall C} \Bigl(\mathsf P(C\mid A,B)\sum_{\forall D} \mathsf P(D\mid C)\Bigr)\\[1ex] &= \sum_{\forall C}\mathsf P(C\mid A,B)\\[1ex] &= 1\end{align}$$

Note that the inner sum over $D$ equals $1$ for each fixed value of $C$, which is what lets it be dropped first; the remaining sum over $C$ then equals $1$ for the fixed values of $A$ and $B$. Hence $P(A,B)=P(A)P(B)$, i.e. $A\perp B$.
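Both independence statements can also be checked numerically. Below is a small sketch (assuming NumPy and arbitrary randomly generated binary-valued conditional probability tables, which are purely illustrative) that builds the joint $P(A,B,C,D)$ from the stated factorization and verifies (i) $P(A,B)=P(A)P(B)$ and (ii) $P(A,D\mid C)=P(A\mid C)P(D\mid C)$:

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(x, axis):
    """Scale entries so they sum to 1 along the given axis."""
    return x / x.sum(axis=axis, keepdims=True)

# Random CPTs for binary variables (hypothetical example tables)
pA = normalize(rng.random(2), 0)             # P(A)
pB = normalize(rng.random(2), 0)             # P(B)
pC_AB = normalize(rng.random((2, 2, 2)), 2)  # P(C|A,B), axes (A,B,C)
pD_C = normalize(rng.random((2, 2)), 1)      # P(D|C), axes (C,D)

# Joint P(A,B,C,D) from the factorization P(A)P(B)P(C|A,B)P(D|C)
joint = np.einsum('a,b,abc,cd->abcd', pA, pB, pC_AB, pD_C)
assert np.isclose(joint.sum(), 1.0)

# (i) A ⊥ B: marginalizing over C and D leaves P(A)P(B)
pAB = joint.sum(axis=(2, 3))
assert np.allclose(pAB, np.outer(pA, pB))

# (ii) A ⊥ D | C: P(A,D|C) factors as P(A|C) P(D|C)
pACD = joint.sum(axis=1)               # P(A,C,D), axes (A,C,D)
pC = pACD.sum(axis=(0, 2))             # P(C)
pAD_given_C = pACD / pC[None, :, None] # P(A,D|C)
pA_given_C = pAD_given_C.sum(axis=2)   # P(A|C), axes (A,C)
pD_given_C = pAD_given_C.sum(axis=0)   # P(D|C), axes (C,D)
assert np.allclose(pAD_given_C,
                   np.einsum('ac,cd->acd', pA_given_C, pD_given_C))
```

The assertions pass for any choice of (properly normalized) tables, since both independences follow from the factorization itself rather than from particular numbers.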