Derivation of an Instance of Bayes' Theorem


In the William Lane Craig vs. Bart Ehrman debate, Dr. Craig presents the formula $Pr(R\ |\ (B\ \cap\ E)) = \frac{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cap\ R))}{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cap\ R)) + Pr(R^c\ |\ B)\ Pr(E\ |\ (B\ \cap\ R^c))}$. (notation slightly modified)

The forms of Bayes' Theorem with which I'm familiar are the probability form, $Pr(X\ |\ Y) = \frac{Pr(X)\ Pr(Y\ |\ X)}{Pr(Y)}$, and the odds form, $\frac{Pr(X\ |\ Y)}{Pr(X^c\ |\ Y)} = \frac{Pr(X)}{Pr(X^c)}\ \frac{Pr(Y\ |\ X)}{Pr(Y\ |\ X^c)}$.

The most progress I can make using my understanding of Bayes' Theorem is $Pr(R\ |\ (B\ \cap\ E)) = \frac{Pr(R)\ Pr((B\ \cap\ E)\ |\ R)}{Pr(B\ \cap\ E)} = \frac{Pr(R)\ Pr((B\ \cap\ E)\ |\ R)}{Pr(B)\ Pr(E\ |\ B)}$. How can I get from here to the formula from the debate, $\frac{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cap\ R))}{Pr(R\ |\ B)\ Pr(E\ |\ (B\ \cap\ R)) + Pr(R^c\ |\ B)\ Pr(E\ |\ (B\ \cap\ R^c))}$?


BEST ANSWER

Recall the definition of conditional probability:

$$P(X | Y) = \frac{P(X \cap Y)}{P(Y)} $$

A useful property of conditional probability is that any identity among probabilities still holds when every probability is additionally conditioned on some other event $B$ (of positive probability):

$$P(X | Y \cap B) = \frac{P(X \cap Y | B)}{P(Y | B)}$$

Of note here is that the conditional-on-$B$ counterpart of $P(X | Y)$ is $P(X | Y \cap B)$, since conditioning on both $Y$ and $B$ is the same as conditioning on their intersection.
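As a sanity check, the conditionalized definition can be verified numerically on a small joint distribution. All probabilities below are hypothetical, chosen only for illustration:

```python
# Hypothetical joint distribution P(B=b, E=e, R=r) over three binary
# events; the values are illustrative and sum to 1.
joint = {
    (1, 1, 1): 0.10, (1, 1, 0): 0.15,
    (1, 0, 1): 0.05, (1, 0, 0): 0.20,
    (0, 1, 1): 0.08, (0, 1, 0): 0.12,
    (0, 0, 1): 0.07, (0, 0, 0): 0.23,
}

def prob(pred):
    """Probability of the event defined by predicate pred(b, e, r)."""
    return sum(p for (b, e, r), p in joint.items() if pred(b, e, r))

# Left side: P(X | Y ∩ B) with X = R, Y = E, conditioning on the
# intersection E ∩ B directly.
lhs = prob(lambda b, e, r: b and e and r) / prob(lambda b, e, r: b and e)

# Right side: P(X ∩ Y | B) / P(Y | B), the conditionalized definition.
rhs = (prob(lambda b, e, r: b and e and r) / prob(lambda b, e, r: b)) \
    / (prob(lambda b, e, r: b and e) / prob(lambda b, e, r: b))

assert abs(lhs - rhs) < 1e-12
```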

We can apply the same principle to Bayes' Theorem (this follows immediately from the above):

$$P(X | Y \cap B) = \frac{P(Y | X \cap B)P(X | B)}{P(Y | B)}$$

Renaming the events to match the question ($X = R$, $Y = E$): $$P(R | B \cap E) = \frac{P(E | B \cap R)P(R | B)}{P(E | B)}$$

At this point we expand the denominator using the law of total probability. For a single event $Y$ and its complement, it reads:

$$P(X) = P(Y)P(X | Y) + P(Y^c)P(X | Y^c)$$

As before, the total probability formula still holds when every probability is conditioned on $B$:

$$P(E | B) = P(R | B)P(E | R \cap B) + P(R^c | B)P(E | R^c \cap B) $$

Substituting this into the denominator yields the formula from the debate.
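To make the substitution concrete, here is a numeric check of the conditional total-probability step and the resulting formula, on a small hypothetical joint distribution over the binary events $B$, $E$, $R$ (all values invented for illustration):

```python
# Hypothetical joint distribution P(B=b, E=e, R=r); values sum to 1.
joint = {
    (1, 1, 1): 0.10, (1, 1, 0): 0.15,
    (1, 0, 1): 0.05, (1, 0, 0): 0.20,
    (0, 1, 1): 0.08, (0, 1, 0): 0.12,
    (0, 0, 1): 0.07, (0, 0, 0): 0.23,
}

def prob(pred):
    """Probability of the event defined by predicate pred(b, e, r)."""
    return sum(p for (b, e, r), p in joint.items() if pred(b, e, r))

p_b = prob(lambda b, e, r: b)
p_e_given_b = prob(lambda b, e, r: b and e) / p_b
p_r_given_b = prob(lambda b, e, r: b and r) / p_b
p_e_given_br = prob(lambda b, e, r: b and r and e) / prob(lambda b, e, r: b and r)
p_e_given_brc = (prob(lambda b, e, r: b and not r and e)
                 / prob(lambda b, e, r: b and not r))

# Conditional total probability:
# P(E|B) = P(R|B) P(E|R ∩ B) + P(R^c|B) P(E|R^c ∩ B)
expansion = p_r_given_b * p_e_given_br + (1 - p_r_given_b) * p_e_given_brc
assert abs(p_e_given_b - expansion) < 1e-12

# Substituting the expansion into the denominator reproduces the
# formula from the debate for P(R | B ∩ E).
posterior = prob(lambda b, e, r: b and e and r) / prob(lambda b, e, r: b and e)
formula = p_r_given_b * p_e_given_br / expansion
assert abs(posterior - formula) < 1e-12
```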

ANSWER

It is just an application of the law of total probability, with everything conditioned on $B$.

$$\begin{align}\mathsf P(R\mid B\cap E) &=\dfrac{\mathsf P(B\cap E\cap R)}{\mathsf P(B\cap E)}\\[2ex]&=\dfrac{\mathsf P(B\cap E\cap R)}{\mathsf P(B\cap E\cap R)+\mathsf P(B\cap E\cap R^\complement)}\\[2ex] &=\dfrac{\mathsf P(B)~\mathsf P(R\mid B)~\mathsf P(E\mid B\cap R)}{\mathsf P(B)~\mathsf P(R\mid B)~\mathsf P(E\mid B\cap R)+\mathsf P(B)~\mathsf P(R^\complement\mid B)~\mathsf P(E\mid B\cap R^\complement)}\\[2ex]&=\dfrac{\mathsf P(R\mid B)~\mathsf P(E\mid B\cap R)}{\mathsf P(R\mid B)~\mathsf P(E\mid B\cap R)+\mathsf P(R^\complement \mid B)~\mathsf P(E\mid B\cap R^\complement)}\end{align}$$
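The chain-rule factorization in the middle step, and the cancellation of $\mathsf P(B)$ in the last step, can likewise be checked numerically on a small hypothetical joint distribution (values invented for illustration):

```python
# Hypothetical joint distribution P(B=b, E=e, R=r); values sum to 1.
joint = {
    (1, 1, 1): 0.10, (1, 1, 0): 0.15,
    (1, 0, 1): 0.05, (1, 0, 0): 0.20,
    (0, 1, 1): 0.08, (0, 1, 0): 0.12,
    (0, 0, 1): 0.07, (0, 0, 0): 0.23,
}

def prob(pred):
    """Probability of the event defined by predicate pred(b, e, r)."""
    return sum(p for (b, e, r), p in joint.items() if pred(b, e, r))

p_b = prob(lambda b, e, r: b)
p_r_given_b = prob(lambda b, e, r: b and r) / p_b
p_rc_given_b = prob(lambda b, e, r: b and not r) / p_b
p_e_given_br = prob(lambda b, e, r: b and r and e) / prob(lambda b, e, r: b and r)
p_e_given_brc = (prob(lambda b, e, r: b and not r and e)
                 / prob(lambda b, e, r: b and not r))

# Chain rule: P(B ∩ E ∩ R) = P(B) P(R|B) P(E|B ∩ R)
assert abs(prob(lambda b, e, r: b and e and r)
           - p_b * p_r_given_b * p_e_given_br) < 1e-12

# After cancelling P(B), the final expression equals P(R | B ∩ E).
posterior = prob(lambda b, e, r: b and e and r) / prob(lambda b, e, r: b and e)
num = p_r_given_b * p_e_given_br
den = num + p_rc_given_b * p_e_given_brc
assert abs(posterior - num / den) < 1e-12
```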