Why are odds of a coin landing heads $50\%$ after $'n'$ consecutive heads


I'm trying to understand how the odds of flipping a fair coin $4$ times in a row and landing heads each time is $\frac{1}{2^4}=\frac{1}{16}=6.25\%$;

But at the same time if I've just flipped the coin heads $3$ times my odds of it landing heads a fourth time are $50\%.$ These numbers seem to contradict one another.

I think I figured it out when I was writing this question, but wanted to confirm. Of the 16 possible ways that $4$ coin flips can go, $2$ have $3$ consecutive heads (below). So if I flip heads $3$ times and am about to flip it a fourth, there are two possible outcomes:

$$\begin{array}{c|cccc}
\text{series} & 1 & 2 & 3 & 4\\ \hline
1 & T & T & T & T\\
\vdots & \vdots & \vdots & \vdots & \vdots\\
15 & H & H & H & T\\
16 & H & H & H & H
\end{array}$$

making the probability of getting a fourth head $= \frac{1\text{ outcome}}{2\text{ possibilities}} = 50\%$. Is this correct?


Best answer

Yep, that's correct!

Here's another phrasing that may make it clearer:

Originally, you have 16 possible options for 4-flip sequences. HHHH and HHHT each have probability 1/16, and so do all the others.

After you've flipped 3 heads, you've narrowed it down to just HHHH and HHHT, and the probabilities are equal.

The probability of HHHH is 1/16. The probability of HHHH given that you've already flipped 3 heads is 1/2. The difference between the scenarios is information - specifically, the information that you've already made some of the flips.
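The two numbers can be checked empirically. A quick simulation sketch (Python; the variable names here are mine, not from the answer) estimates both the unconditional and the conditional probability from the same stream of 4-flip sequences:

```python
import random

random.seed(0)
trials = 200_000

four_heads = 0        # sequences that are HHHH
three_heads = 0       # sequences whose first three flips are heads
four_given_three = 0  # of those, how many also end in a fourth head

for _ in range(trials):
    flips = [random.choice("HT") for _ in range(4)]
    if flips[:3] == ["H", "H", "H"]:
        three_heads += 1
        if flips[3] == "H":
            four_given_three += 1
    if flips == ["H", "H", "H", "H"]:
        four_heads += 1

print(four_heads / trials)             # close to 1/16 = 0.0625
print(four_given_three / three_heads)  # close to 1/2
```

The first ratio uses all trials; the second only uses the trials where the information "first three flips were heads" holds, which is exactly the conditioning described above.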

Answer

You are correct that the probability of a head on the fourth coin flip (regardless of the three previous outcomes) is $\frac{1}{2}$.

The apparent paradox arises because you are comparing a probability to a conditional probability for events that are not independent. The events in question are the event $A$ of getting 4 heads in a row and the event $B$ of getting heads on the first three flips. Clearly these are not independent events (indeed, if $A$ occurs then $B$ must occur, so $P(B|A)=1 \neq P(B)$). You are comparing $P(A|B)$ to $P(A)$, but we should not expect that these probabilities are equal as $A$ and $B$ are not independent events.

The probability of $A$, i.e., the probability of getting four heads on four independent coin flips is $$P(A)=P(4 \text{ heads})=\frac{1}{2}\cdot \frac{1}{2} \cdot \frac{1}{2}\cdot \frac{1}{2}=\frac{1}{16}.$$ However, the probability of $A$ given $B$, i.e., the probability of getting four heads on four independent coin flips given that the first three flips resulted in heads is $$P(A|B)=P(4\text{ heads}|\text{first 3 heads}) = \frac{P(\text{4 heads and first 3 heads})}{P(\text{first 3 heads})}= \frac{\frac{1}{2}\cdot\frac{1}{2}\cdot\frac{1}{2}\cdot\frac{1}{2}}{\frac{1}{2}\cdot\frac{1}{2}\cdot\frac{1}{2}}= \frac{\frac{1}{16}}{\frac{1}{8}}=\frac{1}{2}.$$
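The same computation can be done exactly by enumerating the sample space. A minimal sketch (Python; `A`, `B`, and `omega` mirror the events defined above, and the names are my choice) using `fractions.Fraction` to keep the arithmetic exact:

```python
from fractions import Fraction
from itertools import product

# All 16 equally likely outcomes of four fair flips.
omega = list(product("HT", repeat=4))

A = {s for s in omega if s == ("H", "H", "H", "H")}  # four heads
B = {s for s in omega if s[:3] == ("H", "H", "H")}   # first three heads

P_A = Fraction(len(A), len(omega))
P_B = Fraction(len(B), len(omega))
P_AB = Fraction(len(A & B), len(omega))

print(P_A)         # 1/16
print(P_AB / P_B)  # 1/2, i.e. P(A|B)
```

Since $A \subseteq B$, the intersection $A \cap B$ is just $A$, which is why the ratio collapses to $\frac{1/16}{1/8} = \frac{1}{2}$.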

Answer

Your explanation is correct, but allow me to elaborate. I'll begin by writing down some formal definitions, and try to explain them. Then, I'll refer to your question.

Let's restrict our discussion to the finite discrete case, i.e. assume that you have a sample space $\Omega$ of finite size. $\Omega$ is the set of all possible outcomes of your experiment, e.g.

  1. If the experiment is tossing one coin, then $\Omega = \{H,T\}$.
  2. If the experiment is tossing four coins, then $\Omega = \{(a,b,c,d) : a,b,c,d\in \{H,T\} \} $.

A probability measure on $\Omega$ is a function $P:\Omega \to [0,1]$ that satisfies $\sum_{\omega \in \Omega} P(\omega) = 1 $, namely a function that gives each possible outcome of the experiment its chance of happening. A nice example is the uniform measure, that gives every possible outcome the same probability.

An event in $\Omega$ is just a set of possible outcomes, i.e. $A\subseteq \Omega$ is an event. When we ask for the probability of an event, we ask "what is the probability that the result of our experiment was in A", i.e. $P(A) \triangleq \sum_{\omega \in A} P(\omega)$. Observe that under the uniform measure, we actually get that $P(A) = \frac{|A|}{|\Omega|}$, and that actually might be easier to grasp.

We say that two events $A,B\subseteq \Omega$ are independent if $P(A\cap B) = P(A)\cdot P(B)$. In order to better understand this, consider the following example: If we toss two different fair coins independently, then the sample space is $\{ (H,H),(H,T),(T,H),(T,T) \}$, and since the coins are fair then every outcome has the same probability of $\frac{1}{4}$. Furthermore, since the coins are fair then:

  • The probability of the first coin landing on $H$ is $\frac{1}{2}$, i.e. $P(\{(H,H),(H,T)\}) = \frac{1}{2}$.
  • The probability of the first coin landing on $T$ is $\frac{1}{2}$, i.e. $P(\{(T,H),(T,T)\}) = \frac{1}{2}$.
  • The probability of the second coin landing on $H$ is $\frac{1}{2}$, i.e. $P(\{(H,H),(T,H)\}) = \frac{1}{2}$.
  • The probability of the second coin landing on $T$ is $\frac{1}{2}$, i.e. $P(\{(H,T),(T,T)\}) = \frac{1}{2}$.

We get that $P(\text{first coin } H \text{ and second coin } T) = P(\text{first coin } H) \cdot P(\text{second coin } T)$. This captures the intuition that if the coins are tossed independently of one another then their outcomes should not "impact" each other.
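The independence check on this two-coin sample space can be written out directly. A short sketch (Python; the helper `prob` is a name I'm introducing for illustration) compares $P(A\cap B)$ with $P(A)\cdot P(B)$ under the uniform measure:

```python
from fractions import Fraction
from itertools import product

omega = set(product("HT", repeat=2))  # {(H,H),(H,T),(T,H),(T,T)}

def prob(event):
    # Uniform measure: P(A) = |A| / |Omega|
    return Fraction(len(event), len(omega))

first_H = {s for s in omega if s[0] == "H"}   # first coin lands H
second_T = {s for s in omega if s[1] == "T"}  # second coin lands T

# Independence: P(A ∩ B) = P(A) · P(B)
print(prob(first_H & second_T))        # 1/4
print(prob(first_H) * prob(second_T))  # 1/4
```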

The last definition I want to write is the definition of conditional probability. Suppose $A,B\subseteq \Omega$ are two events. The conditional probability of $A$ given $B$ tries to capture the following case: suppose I conducted my experiment and got a result, and you want to guess what result I got. More specifically, you want to guess whether or not I got a result within $A$. If I don't give you any other information, your best guess would be that the result is in $A$ with probability $P(A)$. However, I'm telling you that the result was in $B$. So now you know that my result wasn't just any result in $\Omega$; I've actually "narrowed down" your feasible sample space. Now I'm asking "given that the result was in $B$, what's the probability that it's also in $A$?". The intuition would say $P(A|B) = \frac{|A\cap B|}{|B|}$ (read: the probability of $A$ given $B$), and in fact this is exactly the case of the uniform measure! In the general case, $P(A|B) = \frac{P(A\cap B)}{P(B)}$, as long as $P(B)>0$.

Actually, considering the definition of conditional probability, we get another, maybe more intuitive definition of independence of events: Two events $A,B \subseteq \Omega$ are independent if $P(A|B) = P(A)$, i.e. I can either tell you that the result of the experiment was in $B$ or not, and that won't change the probability of the result being in $A$.

That concludes the definition section. Now, let's turn to your question. In your experiment, you toss four fair coins independently. That, combined with the explanation above, gives us that $\Omega = \{(a,b,c,d) : a,b,c,d\in \{H,T\} \} $ with the uniform probability measure that gives every possible $(a,b,c,d)\in\Omega$ the same probability of occurring, $\frac{1}{16}$.

You have presented two different questions. The first one was "What is the probability of getting 4 Hs in a row?". The answer to that is, as I mentioned above, $\frac{1}{16}$. The second question was: "Given that I've got 3 Hs, what's the probability that I also get the fourth H?". Using the notations above:

$P($fourth $H| 3$ consecutive $H) = P(\{(H,H,H,H)\}|\{(H,H,H,T),(H,H,H,H)\})$

Since we are considering the uniform measure, we can use the expression I wrote above and get:

$P($fourth $H| 3$ consecutive $H) = \frac{|\{(H,H,H,H)\}\cap\{(H,H,H,T),(H,H,H,H)\}|}{|\{(H,H,H,T),(H,H,H,H)\}|} = \frac{1}{2}$.

That is the formal way of explaining what you claimed.
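The definitions above translate almost verbatim into code. Here is a minimal sketch (Python; `prob` and `cond_prob` are hypothetical helper names, not standard functions) that builds the sample space, the uniform measure, and the conditional probability exactly as defined:

```python
from fractions import Fraction
from itertools import product

omega = set(product("HT", repeat=4))  # sample space: 16 outcomes

def prob(A):
    # Uniform measure: P(A) = |A| / |Omega|
    return Fraction(len(A), len(omega))

def cond_prob(A, B):
    # P(A|B) = P(A ∩ B) / P(B), valid when P(B) > 0
    return prob(A & B) / prob(B)

fourth_H = {s for s in omega if s[3] == "H"}                 # fourth flip is H
first_three_H = {s for s in omega if s[:3] == ("H",) * 3}    # first three are H

print(cond_prob(fourth_H, first_three_H))  # 1/2
```

Note that `first_three_H` is exactly the two-element set $\{(H,H,H,T),(H,H,H,H)\}$ from the computation above, and the intersection with `fourth_H` leaves only $(H,H,H,H)$.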

Hope that helps ☺