Proving $P(A|(B \cap C)) = P(B | (A \cap C)) P(A | C) / P(B | C)$ using Bayes' theorem.


The following equation can be proven, if somewhat inelegantly, by expanding the conditional probabilities, provided that $P(B \cap C)$, $P(A \cap C)$ and $P(C)$ are non-zero.

$$P(A | (B \cap C)) = \frac{P(B | (A \cap C)) P (A | C)}{P(B | C)}$$

However, it looks suspiciously similar to Bayes' theorem:

$$P(A|B) = \frac{P(B|A) P(A)}{P(B)}$$

Can it instead be proven using Bayes' theorem directly?

Edit: Background

For some reason, some papers on robotics (especially those discussing partially observable Markov decision processes) like to call this equation the result of applying "Bayes' rule". So I was wondering whether there is actually a way to use Bayes' rule to obtain it.


You could use $$P(A | (B \cap C)) P(B | C) = P((A \cap B) | C) = P(B | (A \cap C)) P(A | C),$$ which is essentially the same as $$P(A | B) P(B) = P(A \cap B) = P(B | A) P(A)$$ conditioned on $C$.

Alternatively, if $P(C) \gt 0$, you could use $$P(A | (B \cap C)) P(B | C) P(C) = P(A \cap B \cap C) = P(B | (A \cap C)) P(A | C) P(C).$$
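As a sanity check (not a proof), the identity can be verified numerically on a small discrete probability space. The sketch below assigns arbitrary positive weights to the eight outcomes of three binary events $A$, $B$, $C$ and confirms that $P(A \mid B \cap C) = P(B \mid A \cap C)\, P(A \mid C) / P(B \mid C)$; the weight values are arbitrary choices for illustration.

```python
from itertools import product

# Arbitrary positive weights on the 8 outcomes (a, b, c) of three binary
# events; the weights sum to 1, so they form a joint distribution.
weights = dict(zip(
    product([0, 1], repeat=3),
    [0.05, 0.10, 0.15, 0.20, 0.05, 0.25, 0.10, 0.10],
))

def prob(pred):
    """P(event), where the event is given by a predicate on (a, b, c)."""
    return sum(w for (a, b, c), w in weights.items() if pred(a, b, c))

def cond(num, den):
    """Conditional probability P(num | den)."""
    return prob(lambda a, b, c: num(a, b, c) and den(a, b, c)) / prob(den)

# P(A | B ∩ C) versus P(B | A ∩ C) P(A | C) / P(B | C)
lhs = cond(lambda a, b, c: a, lambda a, b, c: b and c)
rhs = (cond(lambda a, b, c: b, lambda a, b, c: a and c)
       * cond(lambda a, b, c: a, lambda a, b, c: c)
       / cond(lambda a, b, c: b, lambda a, b, c: c))

assert abs(lhs - rhs) < 1e-12
```

Any other choice of strictly positive weights works equally well, since the identity holds whenever the three conditioning events have positive probability.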