How should I interpret the results of Bayes theorem in this particular problem?


A bag contains 5 coins. Four of them are fair and one has heads on both sides. You randomly pull one coin from the bag and toss it 5 times; heads turns up all five times. What is the probability that heads turns up on the next toss? (Throughout, you don't know whether you were tossing a fair coin or not.)

This question has been asked on this website before - and I understand the initial solution.

What is the probability that heads turns up on the next toss?

As a step in the given solution, we calculate P(A|B) where A is the event that the coin is fair, and B is the event that you flip 5 heads.

I am trying to solve it a different way, by setting A as the event that the next flip is heads, and B is the event that you flip 5 heads. Then, Bayes theorem states that P(A|B) = P(B|A) * P(A) / P(B). My question is specifically about P(B|A). To my understanding, this would be stated in English as the probability that you flip 5 heads given that the next flip is heads. I'm having trouble interpreting this. How can we condition on an event (next flip is heads) that happens after the 5 flips have already turned up heads?
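For what it's worth, $P(B|A)$ is perfectly well defined even though $B$ happens before $A$ in time: conditioning is about information, not chronology. A sketch using exact fractions (variable names are mine) computes $P(B|A)$ directly as $P(A,B)/P(A)$ by conditioning on the coin type, and then confirms that plugging it into Bayes' theorem reproduces the usual answer:

```python
from fractions import Fraction

# Coin prior: 1 coin in 5 is two-headed, the other 4 are fair.
p_double = Fraction(1, 5)
p_fair = Fraction(4, 5)

# A = next (6th) flip is heads, B = first five flips are all heads.
# Condition on the coin type, which makes every term easy.
p_B = p_double * 1 + p_fair * Fraction(1, 2)**5   # P(B)   = 9/40
p_A = p_double * 1 + p_fair * Fraction(1, 2)      # P(A)   = 3/5
p_AB = p_double * 1 + p_fair * Fraction(1, 2)**6  # P(A,B) = 17/80

p_B_given_A = p_AB / p_A                          # P(B|A) = 17/48
p_A_given_B = p_B_given_A * p_A / p_B             # Bayes: P(A|B) = 17/18
print(p_B_given_A, p_A_given_B)
```

So the approach is valid; the only awkward part is that $P(B|A)$ has no obvious direct interpretation, and you end up computing it by conditioning on the coin type anyway.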

Firstly, is my way of approaching the problem valid? Am I incorrectly applying Bayes theorem? If not, any clarification as to how I should interpret this result would be greatly appreciated.


BEST ANSWER

Your method is a correct application of Bayes' rule, but it is not convenient. As you note, the event that you flip 5 heads given that the next flip is a head is hard to interpret. As a general rule, you want to condition on things that make life simpler. Here, the thing that makes life simpler is knowing whether we have the all-heads coin or not.

If we define events:

  • $A$ = event that we have the [A]ll-head coin

  • $F$ = event that [F]irst [F]ive flips are heads

  • $N$ = event that [N]ext flip is heads

Then we could use double-conditioning:

\begin{align} &P[N|F] \\ &= \underbrace{P[N|F, A]}_{1}P[A|F] + \underbrace{P[N|F,A^c]}_{1/2}P[A^c|F] \end{align}
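Filling in the numbers (a sketch in exact fractions; variable names are mine), $P[A|F]$ comes out to $8/9$, and the double-conditioning formula then gives $17/18$:

```python
from fractions import Fraction

# Priors and likelihoods, conditioning on the coin type.
p_A = Fraction(1, 5)              # P(A): drew the all-heads coin
p_F_given_A = Fraction(1)         # all-heads coin always shows heads
p_F_given_Ac = Fraction(1, 2)**5  # fair coin: (1/2)^5

# Posterior P(A|F) by Bayes' rule.
p_F = p_F_given_A * p_A + p_F_given_Ac * (1 - p_A)
p_A_given_F = p_F_given_A * p_A / p_F   # 8/9

# Double conditioning: P[N|F] = 1 * P[A|F] + (1/2) * P[A^c|F].
p_N_given_F = 1 * p_A_given_F + Fraction(1, 2) * (1 - p_A_given_F)
print(p_N_given_F)   # 17/18
```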


If you do not like double-conditioning, then you could use the definition of conditional probability: $$P[N|F] = \frac{P[N, F]}{P[F]}$$ then separately compute the numerator and the denominator by conditioning on $A$.
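Computing numerator and denominator this way (again a sketch with names of my choosing) gives the same $17/18$:

```python
from fractions import Fraction

p_A = Fraction(1, 5)   # P(A): drew the all-heads coin

# Numerator: P[N, F] = P[N,F|A]P[A] + P[N,F|A^c]P[A^c]
p_NF = 1 * p_A + Fraction(1, 2)**6 * (1 - p_A)   # 17/80

# Denominator: P[F] = P[F|A]P[A] + P[F|A^c]P[A^c]
p_F = 1 * p_A + Fraction(1, 2)**5 * (1 - p_A)    # 9/40

print(p_NF / p_F)   # 17/18
```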

This method seems to be the closest match to what you are already doing. In fact, one answer to "how do I compute $P[F|N]$" is to convert back to the form $P[F|N]=P[F,N]/P[N]$ and then separately compute numerator and denominator by conditioning on $A$.


In all methods above, we condition on $A$.
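As a sanity check on the answer $17/18 \approx 0.944$, here is a small Monte Carlo sketch (purely illustrative): draw a coin, keep only runs where the first five flips are all heads, and record the fraction of sixth flips that are heads.

```python
import random

random.seed(0)
trials = 200_000
runs_kept = next_heads = 0
for _ in range(trials):
    double_headed = random.random() < 0.2   # 1 coin in 5 is all-heads
    flip = (lambda: True) if double_headed else (lambda: random.random() < 0.5)
    if all(flip() for _ in range(5)):       # first five flips all heads
        runs_kept += 1
        next_heads += flip()                # record the sixth flip
print(next_heads / runs_kept)               # close to 17/18 ~ 0.944
```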

ANOTHER ANSWER

You are wrong in stating that the previous solution simply computes $P(A|B)$.

It also computes $P(A^c|B)$, and finally, using the law of total probability, arrives at the answer asked for, viz. $P(\text{heads on the 6th throw} \mid \text{first five throws have given heads})$.