How to derive conditional probability with multiple variables using Bayes' theorem


The formula below is from an article I read for my work:

$$\mathbb{P}(c_i | x,y) = \frac{\mathbb{P}(y|c_i)\,\mathbb{P}(c_i|x)}{\sum_{j=1}^{n}\mathbb{P}(y|c_j)\,\mathbb{P}(c_j|x)}.$$

The author said that he used Bayes' theorem to get this, but I have no idea why this is true!

Can someone please clarify how the left-hand side equals the right-hand side?

Thank you.


There are 2 best solutions below


Not a rigorous proof, but I assume the idea is this:

First, note that Bayes' formula can be written as $$ P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{P(B|A)P(A)}{P(B)} $$ This explains the reversed conditional probabilities $\mathbb{P}(y|c_i)$ you see in the numerator and the denominator.

Edit: I forgot to mention that, by the law of total probability, in the multivariable case $P(B) = \sum_j P(B|A_j) P(A_j)$.
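As a quick numerical sanity check (the numbers below are purely hypothetical), you can compute the update directly: multiply each $\mathbb{P}(y|c_i)$ by $\mathbb{P}(c_i|x)$ and normalize by their sum, which plays the role of $P(B)$ above.

```python
# Toy check of P(c_i | x, y) = P(y|c_i) P(c_i|x) / sum_j P(y|c_j) P(c_j|x).
# The probabilities here are made up for illustration only.

prior_given_x = [0.5, 0.3, 0.2]   # P(c_i | x) for i = 1, 2, 3
likelihood = [0.9, 0.4, 0.1]      # P(y | c_i)

# Normalizing constant: the law-of-total-probability sum in the denominator.
evidence = sum(l * p for l, p in zip(likelihood, prior_given_x))

# Posterior over the classes c_i given both x and y.
posterior = [l * p / evidence for l, p in zip(likelihood, prior_given_x)]

print(posterior)
print(sum(posterior))  # normalizes to 1 by construction
```

Note that the posterior sums to 1 regardless of the toy numbers chosen, because the denominator is exactly the sum of the numerators.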


Fortunately I found the answer:

[the derivation was posted as an image and is not transcribed here]
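For reference, here is a sketch of the derivation I believe the author had in mind. The key assumption, not stated in the question, is that $y$ and $x$ are conditionally independent given $c_i$, i.e. $\mathbb{P}(y|c_i,x) = \mathbb{P}(y|c_i)$:

$$\mathbb{P}(c_i|x,y) = \frac{\mathbb{P}(c_i, y \mid x)}{\mathbb{P}(y|x)} = \frac{\mathbb{P}(y|c_i,x)\,\mathbb{P}(c_i|x)}{\sum_{j=1}^{n}\mathbb{P}(y|c_j,x)\,\mathbb{P}(c_j|x)} = \frac{\mathbb{P}(y|c_i)\,\mathbb{P}(c_i|x)}{\sum_{j=1}^{n}\mathbb{P}(y|c_j)\,\mathbb{P}(c_j|x)}.$$

The first step is the definition of conditional probability (everything conditioned on $x$), the second expands the denominator with the law of total probability, and the last applies the conditional-independence assumption.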