Bayes Theorem with multiple random variables


I came across this expression in the Intro to Probability book I am studying:

$P(A,B|C)=\frac{P(C)P(B|C)P(A|B,C)}{P(C)}$

Could anyone please explain how this is obtained? From a simple application of Bayes' rule, shouldn't it be:

$P(A,B|C)=\frac{P(C|A,B)P(A,B)}{P(C)}$ where $P(A,B) = P(A|B)P(B)$ ?


Best answer

By the definition of conditional probability: $P(Y \mid X)=P(X,Y)/P(X) \implies P(X,Y)=P(Y\mid X) P(X)$. Hence (think of $A,B$ as a single multivariate variable):

$$P(A,B\mid C)=\frac{P(A,B,C)}{P(C)}$$

Then, by the chain rule of probability, $P(A,B,C)=P(A\mid B,C)P(B,C)$ and $P(B,C)=P(B\mid C)P(C)$. Substituting these in gives $P(A,B\mid C)=P(A\mid B,C)P(B\mid C)$, which is exactly the book's expression once the common factor $P(C)$ in its numerator and denominator is cancelled.
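To make the chain-rule identity concrete, here is a small numerical sanity check with NumPy. The joint table below is made up purely for illustration (any normalised table over three binary variables would do); it verifies $P(A,B\mid C)=P(A\mid B,C)\,P(B\mid C)$:

```python
import numpy as np

# Arbitrary joint distribution P(A, B, C) over three binary variables,
# indexed as joint[a, b, c]; the values are made up and sum to 1.
joint = np.array([[[0.02, 0.08], [0.10, 0.05]],
                  [[0.20, 0.15], [0.25, 0.15]]])
assert np.isclose(joint.sum(), 1.0)

P_C = joint.sum(axis=(0, 1))   # P(C), shape (2,)
P_BC = joint.sum(axis=0)       # P(B, C), shape (2, 2)

# Left-hand side: P(A, B | C) = P(A, B, C) / P(C)
lhs = joint / P_C

# Right-hand side via the chain rule:
# P(A | B, C) * P(B | C) = [P(A,B,C) / P(B,C)] * [P(B,C) / P(C)]
P_A_given_BC = joint / P_BC    # P(A | B, C)
P_B_given_C = P_BC / P_C       # P(B | C)
rhs = P_A_given_BC * P_B_given_C

print(np.allclose(lhs, rhs))   # True
```

The broadcasting works because NumPy aligns trailing axes: dividing the $(2,2,2)$ joint table by the $(2,)$ marginal conditions on the last axis ($C$), and dividing by the $(2,2)$ marginal conditions on the last two axes ($B,C$).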


The extension of Bayes' rule to a further conditioning variable is $$P(A|B,C)=\frac{\color{blue}{P(A)}\color{blue}{P(B|A)}P(C|A,B)}{\color{blue}{P(B)}P(C|B)}\tag{1}$$ (note the left-hand side: the numerator equals $P(A,B,C)$ and the denominator equals $P(B,C)$, so the ratio is $P(A\mid B,C)$).

The formula can be seen as an extension of $$P(A|B)=\frac{\color{blue}{P(A)P(B|A)}}{\color{blue}{P(B)}}\tag{2}$$ This is not a derivation, of course; I'm just trying to show you that the $\color{blue}{\mathrm{blue}}$ terms are common to both $(1)$ and $(2)$, and that if you reverse the conditioning for the extra variable ($C$) by placing the corresponding conditional probabilities in both numerator and denominator, you arrive at $(1)$.
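One can check numerically that the right-hand side of $(1)$ equals $P(A\mid B,C)=P(A,B,C)/P(B,C)$. The sketch below uses the same made-up NumPy joint table as before (the values are illustrative only):

```python
import numpy as np

# Arbitrary, normalised joint table P(A, B, C), indexed joint[a, b, c].
joint = np.array([[[0.02, 0.08], [0.10, 0.05]],
                  [[0.20, 0.15], [0.25, 0.15]]])

P_A = joint.sum(axis=(1, 2))             # P(A), shape (2,)
P_B = joint.sum(axis=(0, 2))             # P(B), shape (2,)
P_AB = joint.sum(axis=2)                 # P(A, B), shape (2, 2)
P_BC = joint.sum(axis=0)                 # P(B, C), shape (2, 2)

P_B_given_A = P_AB / P_A[:, None]        # P(B | A)
P_C_given_AB = joint / P_AB[:, :, None]  # P(C | A, B)
P_C_given_B = P_BC / P_B[:, None]        # P(C | B)

# Right-hand side of (1): P(A) P(B|A) P(C|A,B) / (P(B) P(C|B))
rhs = (P_A[:, None, None] * P_B_given_A[:, :, None] * P_C_given_AB
       / (P_B[None, :, None] * P_C_given_B[None, :, :]))

# P(A | B, C) = P(A, B, C) / P(B, C)
lhs = joint / P_BC[None, :, :]

print(np.allclose(lhs, rhs))             # True
```

The check mirrors the algebra: the numerator of $(1)$ telescopes to $P(A,B,C)$ by the chain rule, and the denominator telescopes to $P(B,C)$.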