What exactly is the difference between the theorem of total probability and Bayes' theorem?


The theorem of total probability states that $ P(A) = \sum\limits_{i=1}^nP(A|E_i)P(E_i)$ for events $E_1, \ldots, E_n$ that partition the sample space, where $P(A|E_i) = \frac{P(A \cap E_i)}{P(E_i)}$,

which in my eyes is the same as saying $ P(E_j|A)= \frac{P(E_j)P(A|E_j)}{\sum\limits_{i=1}^nP(E_i)P(A|E_i)}.$

Mathematically they may not look the same, but aren't they describing the same thing? Why are they considered different?


1 Answer


Suppose events $E_1, E_2, \ldots$ partition the sample space; then the law of total probability (in one form) states $$\Pr[A] = \sum_i \Pr[A \mid E_i]\Pr[E_i]. \tag{1}$$

The definition of conditional probability is $$\Pr[A \mid E_i] = \frac{\Pr[A \cap E_i]}{\Pr[E_i]}, \tag{2}$$ whenever $\Pr[E_i] > 0$.
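As a quick numeric sanity check of $(1)$ and $(2)$, here is a small sketch with a hypothetical three-event partition and made-up probabilities:

```python
# Hypothetical partition E1, E2, E3 with arbitrary (but valid) probabilities.
P_E = [0.2, 0.5, 0.3]          # P(E_i); must sum to 1
P_A_given_E = [0.1, 0.6, 0.4]  # P(A | E_i), chosen arbitrarily

# Law of total probability, equation (1):
P_A = sum(pa * pe for pa, pe in zip(P_A_given_E, P_E))
print(P_A)  # 0.2*0.1 + 0.5*0.6 + 0.3*0.4 = 0.44

# Definition (2): the joint P(A and E_i) divided by P(E_i)
# recovers the conditional P(A | E_i).
P_A_and_E = [pa * pe for pa, pe in zip(P_A_given_E, P_E)]
recovered = [joint / pe for joint, pe in zip(P_A_and_E, P_E)]
print(recovered)  # matches P_A_given_E
```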

Bayes' rule/theorem is simply the application of the above definition $(2)$, twice:

$$\Pr[E_i \mid A]\Pr[A] = \Pr[A \cap E_i] = \Pr[A \mid E_i]\Pr[E_i],$$ therefore

$$\Pr[E_i \mid A] = \frac{\Pr[A \mid E_i]\Pr[E_i]}{\Pr[A]}, \tag{3}$$

whenever $\Pr[A] > 0$.

When applying Equation $(3)$, we commonly use $(1)$ to evaluate the denominator of the right-hand side, but they are distinct theorems.
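To make that last point concrete, here is a sketch (reusing the same hypothetical numbers as above) that applies $(3)$ with the denominator evaluated via $(1)$:

```python
# Hypothetical prior over a partition and likelihoods of A under each piece.
P_E = [0.2, 0.5, 0.3]          # prior P(E_i)
P_A_given_E = [0.1, 0.6, 0.4]  # likelihood P(A | E_i)

# Denominator of Bayes' rule (3), computed with the
# law of total probability (1):
P_A = sum(pa * pe for pa, pe in zip(P_A_given_E, P_E))

# Posterior P(E_i | A) from equation (3):
posterior = [pa * pe / P_A for pa, pe in zip(P_A_given_E, P_E)]
print(posterior)  # sums to 1, as any conditional distribution must
```

Note that $(1)$ is only a convenience here: if $\Pr[A]$ were known directly, Bayes' rule $(3)$ could be applied without it.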