Intuition behind a particular formulation of Bayes's Theorem : $\dfrac{P(A\mid B)}{P(A)} = \dfrac{P(B\mid A)}{P(B)}$?


Bayes's theorem states $P(A\mid B) = \dfrac{P(B\mid A)\cdot P(A)}{P(B)}$. The intuition behind this is simple: if $B$ is true, then the probability that $A$ is true is the fraction of the cases where $B$ is true in which $A$ is also true.

Now, here is another formulation of the rule, just rearranging fractions: $\dfrac{P(A\mid B)}{P(A)} = \dfrac{P(B\mid A)}{P(B)}$. To me, what this says is that "if upon learning $B$ is true, we think $A$ is $x$ times more likely to be true than we previously thought, then upon learning $A$, $B$ is $x$ times more likely to be true than we previously thought." But this sentence does not seem similarly obvious to me. Is there a natural interpretation of $\dfrac{P(A\mid B)}{P(A)} = \dfrac{P(B\mid A)}{P(B)}$?


I agree that this is less obvious, but you can go some way towards making intuitive sense of it by noting that it's obvious in three important cases:

If $A$ and $B$ are identical, it's obviously true by symmetry.

If $A$ and $B$ are independent, it's obviously true because both ratios must then be $1$.

If $A$ and $B$ are mutually exclusive, it's obviously true because both ratios must then be $0$.

Given this, it would be surprising if such simple ratios coincided at three different points but not in general.
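The three cases above can be checked numerically. Here is a minimal sketch on the 36-outcome space of two fair dice; the specific events $A$, $B$, $C$ are illustrative assumptions, not part of the answer:

```python
from fractions import Fraction

# Numeric check (not a proof) of P(A|B)/P(A) = P(B|A)/P(B)
# on the sample space of two fair dice.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

def lift(a, b):
    # P(a|b) / P(a): how much learning b rescales our belief in a
    return (prob(lambda w: a(w) and b(w)) / prob(b)) / prob(a)

A = lambda w: w[0] + w[1] == 7   # sum is 7
B = lambda w: w[0] == 3          # first die shows 3 (independent of A)
C = lambda w: w[0] + w[1] == 2   # sum is 2 (mutually exclusive with A)

assert lift(A, A) == Fraction(6)          # identical events: 1/P(A)
assert lift(A, B) == lift(B, A) == 1      # independent: both ratios are 1
assert lift(A, C) == lift(C, A) == 0      # mutually exclusive: both are 0
```

Exact rational arithmetic via `Fraction` avoids floating-point noise, so the equalities hold exactly rather than approximately.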


You may be confusing yourself because you are skipping a step. Consider the definition of conditional probability. $$P(A|B)=\frac{P(A\cap B)}{P(B)}\implies P(A\cap B)=P(A|B)P(B)$$

The intersection of two sets is commutative, i.e. $$P(A\cap B) = P(B\cap A)=P(B|A)P(A)$$

Therefore $P(A|B)P(B)=P(B|A)P(A)$, and dividing both sides by $P(A)P(B)$ gives exactly the formulation in the question: $\dfrac{P(A|B)}{P(A)} = \dfrac{P(B|A)}{P(B)}$.
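The derivation can be illustrated with concrete numbers. A minimal sketch on two fair coin flips (the events chosen here are assumptions for illustration only):

```python
from fractions import Fraction

# Both sides of the ratio form, computed from the joint probability
# P(A and B), on the 4-outcome space of two fair coin flips.
omega = [(x, y) for x in "HT" for y in "HT"]

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"             # first flip is heads
B = lambda w: "H" in w                # at least one head

p_a, p_b = prob(A), prob(B)           # 1/2 and 3/4
p_ab = prob(lambda w: A(w) and B(w))  # 1/2, since A implies B here

# P(A|B)/P(A) and P(B|A)/P(B) agree; here both equal 4/3.
assert (p_ab / p_b) / p_a == (p_ab / p_a) / p_b == Fraction(4, 3)
```

So learning $B$ makes $A$ $4/3$ times as likely as before, and learning $A$ makes $B$ $4/3$ times as likely as before, just as the rearranged theorem predicts.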