Replacing Intersection With Union in Bayes' Theorem


In this question, I asked about deriving the formula $$Pr(R\ |\ (B\ \cap\ E)) = \frac{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cap\ R))}{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cap\ R)) + Pr(R^c\ |\ B)\ Pr(E\ |\ (B\ \cap\ R^c))}.$$ However, I don't think I got the operator on the LHS correct; it only makes sense to consider the probability of $R$ given the union of $B$ and $E$, not the intersection. Is it valid to correct this by flipping the intersections on both sides to unions (and vice-versa, if there were any unions), yielding $$Pr(R\ |\ (B\ \cup\ E)) = \frac{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cup\ R))}{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cup\ R)) + Pr(R^c\ |\ B)\ Pr(E\ |\ (B\ \cup\ R^c))},$$ or would the entire formula need to be derived from scratch? If the latter, how could this be achieved, since we no longer have the conditional probability rule to start from?
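As a sanity check (my own addition, not part of the linked derivation), the original intersection version of the formula can be verified numerically on a small uniform sample space. The events below are arbitrary choices for illustration:

```python
from fractions import Fraction

omega = set(range(1, 7))                     # uniform 6-point sample space
R, B, E = {1, 2}, {1, 2, 3, 4}, {2, 3, 5}    # arbitrary test events
Rc = omega - R

def cond(A, C):
    # Pr(A | C) by counting, valid because the outcomes are equally likely
    return Fraction(len(A & C), len(C))

lhs = cond(R, B & E)
num = cond(R, B) * cond(E, B & R)
rhs = num / (num + cond(Rc, B) * cond(E, B & Rc))
print(lhs, rhs)  # both sides agree: 1/2 1/2
```

Any other choice of events (with nonempty conditioning sets) gives the same agreement, as the intersection formula is an identity.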

Accepted answer:

Here is a counterexample to your second equation, in which the $\cap$s are replaced with $\cup$s.

Suppose that $Pr(R \cup E) = 1/2$, with $Pr(R \cap E) = 1/6$, and $Pr(R) = Pr(E) = 1/3$. And suppose that $B = (R \cup E)^c$, so $Pr(B) = 1/2$, with $Pr(B \cap R) = Pr(B \cap E) = 0$.

Calculating all the terms in the second formula, we have:

  • $Pr(R \mid B \cup E) = (1/6)/(5/6) = 1/5$,
  • $Pr(R \mid B) = 0$,
  • $Pr(E \mid B \cup R) = (1/6)/(5/6) = 1/5$,
  • $Pr(R^c \mid B) = (1/2)/(1/2) = 1$ (since $B \subseteq R^c$), and
  • $Pr(E \mid B \cup R^c) = (1/6)/(2/3) = 1/4$.

Putting this all together, the LHS of the conjectured second formula is $1/5$, while the RHS is $$\frac{0 \times 1/5}{(0 \times 1/5) + (1 \times 1/4)} = 0,$$ so the formula is not correct.
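The counterexample can be checked mechanically. The sample space below is my own construction (not given in the answer) realizing the stated probabilities: six equally likely outcomes with $R = \{1,2\}$, $E = \{2,3\}$, and $B = (R \cup E)^c$:

```python
from fractions import Fraction

omega = set(range(1, 7))   # uniform 6-point sample space
R = {1, 2}                 # Pr(R) = 1/3
E = {2, 3}                 # Pr(E) = 1/3, Pr(R ∩ E) = 1/6, Pr(R ∪ E) = 1/2
B = omega - (R | E)        # B = (R ∪ E)^c = {4, 5, 6}, Pr(B) = 1/2
Rc = omega - R

def cond(A, C):
    # Pr(A | C) by counting, valid because the outcomes are equally likely
    return Fraction(len(A & C), len(C))

lhs = cond(R, B | E)
num = cond(R, B) * cond(E, B | R)
rhs = num / (num + cond(Rc, B) * cond(E, B | Rc))
print(lhs, rhs)  # 1/5 0 — the two sides disagree
```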

Given my understanding of the context of the original question, the original operator on the LHS, $Pr(R \mid B \cap E)$, is correct. You want the probability of $R$ (resurrection) given both $B$ (background) and $E$ (evidence). Why would you condition on $B \cup E$ instead? Knowing only "at least one of $B$ or $E$" gives you less information about $R$ than knowing "both $B$ and $E$". For another example, let $R$ be the event of rolling a 6 on a die, $B$ the event that the roll is greater than 4, and $E$ the event that the roll is even. If you know that both $B$ and $E$ occurred, you learn more about $R$ by conditioning on $B \cap E$ than on $B \cup E$: here $Pr(R \mid B \cap E) = 1$, while $Pr(R \mid B \cup E) = 1/4$.
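The die example above can be confirmed with the same counting approach:

```python
from fractions import Fraction

omega = set(range(1, 7))          # fair six-sided die
R, B, E = {6}, {5, 6}, {2, 4, 6}  # R: roll a 6, B: roll > 4, E: roll even

def cond(A, C):
    # Pr(A | C) by counting, valid because the outcomes are equally likely
    return Fraction(len(A & C), len(C))

print(cond(R, B & E))  # Pr(R | B ∩ E) = 1, since B ∩ E = {6}
print(cond(R, B | E))  # Pr(R | B ∪ E) = 1/4, since B ∪ E = {2, 4, 5, 6}
```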