Bayes' Theorem and Total Probability


How to prove this by induction: if $C_1, C_2, \dots, C_k$ partition the sample space, then $$ \begin{align*} P(A \mid B)&=\sum_i P(A\mid C_i \, B) \, P(C_i \mid B). \end{align*} $$ The base case is trivial, but I don't see what to do in the inductive step: splitting the sum into the first $k$ terms plus the $(k+1)$-st term does not help, because $C_1, \dots, C_k$ alone do not partition the sample space once $C_{k+1}$ is present, so I cannot apply the inductive hypothesis to them.



BEST ANSWER

I'm not very enthusiastic about using induction here either, but...

I guess that the induction is to be used on the number of parts, not on the parts themselves.

Say, define $D_i = C_i$ for $i = 1, \dots, k-1$ and $D_k = C_k \cup C_{k+1}$.

Then show that the last two terms of the sum $\sum_{i=1}^{k+1} P(A\mid C_i \, B) \, P(C_i \mid B)$ are equivalent to one term using $D_k$, and then you can use induction, using the hypothesis on the $k$ $\{D_i\}$ parts.

$$\begin{align*} P(A\mid C_k \, B) \, P(C_k \mid B)+P(A\mid C_{k+1} \, B) \, P(C_{k+1} \mid B) &=\frac{P(A \cap C_k \cap B)}{P(B)} + \frac{P(A \cap C_{k+1} \cap B)}{P(B)} \\ &=\frac{P(A \cap B \cap (C_k \cup C_{k+1}))}{P(B)} \\ &=P(A \mid D_k B) \, P(D_k \mid B). \end{align*}$$
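With the last two terms merged into a single term for $D_k$, one way to spell out the rest of the inductive step (still assuming all the conditioning events have positive probability, so the hypothesis applies to the $k$-part partition $\{D_i\}$) is:

$$\sum_{i=1}^{k+1} P(A\mid C_i \, B) \, P(C_i \mid B) = \sum_{i=1}^{k} P(A\mid D_i \, B) \, P(D_i \mid B) = P(A \mid B).$$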


This is not a theorem I would prove by induction. If the $C_i$ partition the sample space then $$ P(A\cap B) =P(A \cap B \cap \Omega) = P(A \cap B \cap\cup_i C_i) = \sum_iP(A\cap B\cap C_i) $$ and now divide both sides by $P(B)$. The left gives $P(A|B)$, and on the right we have terms $$ \frac{P(A\cap (B \cap C_i))}{P(B)} = \frac{P(A\cap (B \cap C_i))}{P(B\cap C_i)}\frac{P(B \cap C_i)}{P(B)} = P(A|B\cap C_i)P(C_i|B). $$

Note here that for this argument to work we must assume $P(B)>0$ and $P(B\cap C_i)>0$ for all $i$; without that assumption the conditional probabilities on the right-hand side are not even defined.
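As a sanity check, the identity can be verified numerically on a small finite sample space. The outcomes, events, and partition below are made up for illustration; terms with $P(C_i \cap B) = 0$ are skipped, matching the positivity caveat above.

```python
# Numerical check of P(A|B) = sum_i P(A | C_i ∩ B) P(C_i | B)
# on a toy finite sample space with exact rational arithmetic.
from fractions import Fraction

# Six equally likely outcomes; C1, C2, C3 partition the sample space.
omega = range(6)
p = {w: Fraction(1, 6) for w in omega}

A = {0, 1, 3}
B = {1, 2, 3, 4}
C = [{0, 1}, {2, 3}, {4, 5}]  # a partition of omega

def prob(event):
    return sum(p[w] for w in event)

def cond(x, y):  # P(x | y), assuming P(y) > 0
    return prob(x & y) / prob(y)

lhs = cond(A, B)
rhs = sum(cond(A, Ci & B) * cond(Ci, B)
          for Ci in C if prob(Ci & B) > 0)
print(lhs == rhs)  # True
```

Using `Fraction` avoids floating-point noise, so the two sides compare exactly equal rather than merely close.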