Is this a paradox about probability of a fair coin at very large numbers of flips?


The binomial formula for the probability of x heads on n flips with a probability of 0.5 is:$$\displaystyle \frac{\frac{n!}{x!(n-x)!}}{2^{n}}$$

For the probability of getting exactly n/2 heads, this becomes: $$\displaystyle\frac{\frac{n!}{(\frac{n}{2})!(\frac{n}{2})!}}{2^{n}}$$

I don't know how to evaluate this in the limit as n approaches infinity, but I used a spreadsheet's BINOMDIST(x, n, 0.5, FALSE) function to calculate the probability of exactly 50% heads on n flips (n = 2x), with x ranging from ten to a billion in steps of powers of 10. Up to 1 billion heads on 2 billion flips, the probability of exactly 50% heads decreases, per the table below.

| Heads | Flips | Probability |
| --- | --- | --- |
| 10 | 20 | 0.176197052 |
| 100 | 200 | 0.05634847901 |
| 1000 | 2000 | 0.01783901115 |
| 10000 | 20000 | 0.005641825312 |
| 100000 | 200000 | 0.001784121886 |
| 1000000 | 2000000 | 0.000564189513 |
| 10000000 | 20000000 | 0.0001784124094 |
| 100000000 | 200000000 | 0.00005641895828 |
| 1000000000 | 2000000000 | 0.00001784124116 |
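These values can also be reproduced exactly with integer arithmetic, rather than a spreadsheet; a minimal Python sketch (my own, not part of the original post):

```python
from math import comb

# exact central binomial probability: C(2k, k) / 2^(2k)
def p_half(k):
    return comb(2 * k, k) / 2 ** (2 * k)

for k in (10, 100, 1000):
    print(k, p_half(k))
```

The first row, for example, gives p_half(10) ≈ 0.176197052, matching the table.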

It seems that the probability of an outcome with exactly 50% heads decreases as the number of flips increases. This makes sense to me because the number of categories of outcomes that are near (but not equal to) 50% heads increases with increasing numbers of flips, so some of the most central probability would be reapportioned to the near-neighbor categories. I also know that the likelihood of getting extreme results decreases as n increases, so that more of the probability is centralized around the 50% heads rather than the tails.

However, from another point of view, it seems surprising. If the trend continues in the limit that n approaches infinity, then the probability of getting exactly 50% heads is zero when the coin is flipped an infinite number of times. This contradicts how I had thought expectation values work. I had thought that by flipping a coin an infinite number of times, the limiting-case behavior equates to the expectation value. The expectation value of heads for the coin is 0.5, but the binomial function suggests that we never reach it.

I recognize that the probability distribution becomes increasingly concentrated around 50% in a symmetric way. However, it seems that it never actually lands there, even after an infinite number of flips, which seems paradoxical.

Am I understanding this correctly?

Best answer:

Stirling's approximation can be used to show that we have

$$\frac{ {2k \choose k} }{2^{2k}} \sim \frac{1}{\sqrt{\pi k}}$$

so you're correct that the probability of getting exactly 50% heads decays to zero as $n \to \infty$, but slowly.
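You can check the quality of Stirling's estimate numerically; a small sketch (my own illustration) compares the exact central probability to $1/\sqrt{\pi k}$:

```python
import math

# ratio of the exact central probability C(2k, k) / 2^(2k)
# to the Stirling estimate 1 / sqrt(pi * k); it approaches 1
for k in (10, 100, 1000):
    exact = math.comb(2 * k, k) / 2 ** (2 * k)
    approx = 1 / math.sqrt(math.pi * k)
    print(k, exact / approx)
```

The ratio is roughly $1 - \frac{1}{8k}$, so already at $k = 1000$ the two agree to about four decimal places.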

How do we reconcile this with the intuition that as $n \to \infty$ we expect the fraction of heads to converge to 50%? In other words, does this contradict the law of large numbers? The answer is that it does not, because to ask for exactly 50% heads as $n \to \infty$ is much too strict (and that's not what the law of large numbers says). Here's an example of what we can prove: if we ask for the probability of getting between 49.9% heads and 50.1% heads, that probability converges to $1$ (very rapidly). More generally, for any $\varepsilon > 0$, the probability that the fraction of heads is between $\frac{1}{2} - \varepsilon$ and $\frac{1}{2} + \varepsilon$ converges to $1$. This is the weak law of large numbers.
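To see the weak law in action with a fixed window, say $\varepsilon = 1\%$, one can sum the binomial pmf over the window; here is a sketch (my own, with an arbitrary choice of sample sizes) that starts from the central term via log-gamma and fills in neighbors with the pmf recurrence:

```python
import math

def prob_within(n, eps):
    """P(|heads/n - 1/2| <= eps) for n fair flips.

    Assumes n is even and eps*n is an integer, so the window
    around n/2 is symmetric.
    """
    x0 = n // 2
    # central pmf C(n, n/2) / 2^n via log-gamma, avoiding huge integers
    log_p = math.lgamma(n + 1) - 2 * math.lgamma(x0 + 1) - n * math.log(2)
    p = math.exp(log_p)
    total = p
    hi = x0 + round(n * eps)
    # pmf(x+1) = pmf(x) * (n - x) / (x + 1); count both tails by symmetry
    for x in range(x0, hi):
        p *= (n - x) / (x + 1)
        total += 2 * p
    return total

for n in (100, 10_000, 1_000_000):
    print(n, prob_within(n, 0.01))
```

With the window fixed at ±1%, the probability climbs from about 0.24 at $n = 100$ to about 0.96 at $n = 10^4$ and is indistinguishable from 1 at $n = 10^6$, exactly as the weak law predicts.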

To get even more specific than this, the central limit theorem characterizes the probability that the fraction of heads is between $\frac{1}{2} - \frac{x}{\sqrt{n}}$ and $\frac{1}{2} + \frac{x}{\sqrt{n}}$; the limit turns out to be a function of $x$ which grows towards $1$, so that e.g. we can find a specific value of $x$ such that this probability converges to more than 99%. Informally, the central limit theorem allows us to say that the fraction of heads is within some small multiple of $\frac{1}{\sqrt{n}}$ away from $\frac{1}{2}$ with high probability, which is pretty good for large $n$; e.g. when $n = 10^6$ this is about $\frac{1}{10^3}$.
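As a concrete check of that last claim: since the fraction of heads has standard deviation $\frac{1}{2\sqrt{n}}$, the limiting probability for a window of half-width $\frac{x}{\sqrt{n}}$ is $\operatorname{erf}(x\sqrt{2})$, and a short sketch (my own) finds the $x$ that gives 99%:

```python
import math

# limiting probability that the fraction of heads lies within
# x/sqrt(n) of 1/2: the fraction has sd 1/(2*sqrt(n)), so the
# window extends 2x standard deviations on each side
def limit_prob(x):
    return math.erf(x * math.sqrt(2))

def x_for(target):
    # bisection works because limit_prob is increasing in x
    lo, hi = 0.0, 10.0
    while hi - lo > 1e-9:
        mid = (lo + hi) / 2
        if limit_prob(mid) < target:
            lo = mid
        else:
            hi = mid
    return lo

print(x_for(0.99))  # about 1.288, i.e. 2.576 standard deviations
```

So being within about $1.29/\sqrt{n}$ of $\frac{1}{2}$ happens with 99% probability in the limit.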