What is the chance we flip the coin $10$ or more times?


I am interested in a more involved version of the following problem:

Suppose we flip a series of independent fair coins. For every head we add $1$ to the total. For every tails we add $0$ to the total. When the total reaches $10$ we stop flipping. With what probability do we flip the coin $20$ or more times?

In reality I only need an estimate of the chance we flip $20$ or more times. Here $20$ is an arbitrary number: I would like a bound that applies to any $N$, and the bound should shrink as $N \to \infty$. If possible I'd like the bound to be in terms of the variance of the variables.

I know how to use concentration results to get facts like the following: If I flip $100$ coins $X_1,\ldots, X_{100}$ then for $S =X_1 + \ldots + X_{100}$ Hoeffding's inequality says

$$P (S < 10) = P (S -50 < -40) = P (S -\mathbb E[S] < -40)$$ $$ \le \exp \left( - \frac{2 \cdot 40^2}{100}\right)= \exp \left( - \frac{3200}{100}\right) = e^{-32}.$$
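As a sanity check, this Hoeffding bound can be compared against the exact binomial tail; a minimal sketch using only the standard library:

```python
from math import comb, exp

n = 100
# Exact P(S < 10) for S ~ Bin(100, 1/2): sum the first ten binomial terms.
exact = sum(comb(n, k) for k in range(10)) / 2**n
# Hoeffding bound from above: exp(-2 * 40^2 / 100) = e^{-32}.
bound = exp(-2 * 40**2 / n)
print(exact, bound)
```

The exact tail is several orders of magnitude below the Hoeffding bound, which is expected since Hoeffding uses only the boundedness of the summands.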

So the probability of getting less than $10$ in $100$ coins is pretty darn small.

From this we can imagine it's very unlikely we'll flip the coin $100$ times in our original problem.

I can't see how to formalise this, however, since we cannot easily define a variable to capture the $n$th flip when that flip might not exist.

Could anyone give some advice or references on how to approach this type of problem?

There are 4 answers below.

BEST ANSWER

What you've described is the negative binomial distribution. We say $X$ has a negative binomial distribution, $X \sim NB(r, p)$, if $X$ is the number of independent Bernoulli trials of parameter $p$ until we get $r$ successes. So in your example, if $X$ is the number of trials until we get $10$ heads, then $X$ has the $NB(10, 1/2)$ distribution.

To calculate its probability distribution, let's calculate the probability that it takes exactly $k$ trials to reach $10$ heads (note $k$ must be greater than or equal to $10$, otherwise we can never get $10$ heads). Since we stop at the $10$th head, the $k$th trial itself must be a head, and the previous $k-1$ trials must contain exactly $9$ heads. The probability of flipping $9$ heads in $k-1$ trials is just a binomial probability, $${{k-1}\choose 9}(1/2)^9(1-1/2)^{k-1-9}={{k-1}\choose 9}(1/2)^{k-1}.$$ Multiplying by the probability $1/2$ that the $k$th flip is a head, the overall probability is $${{k-1}\choose 9}(1/2)^{k}.$$
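With this pmf, the probability in the original question is the tail $P(X \ge 20) = 1 - \sum_{k=10}^{19} P(X=k)$. A minimal sketch using only the standard library (`nb_pmf` is just an illustrative helper name):

```python
from math import comb

def nb_pmf(k, r=10, p=0.5):
    """P(exactly k flips to reach r heads): C(k-1, r-1) * p^r * (1-p)^(k-r)."""
    return comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)

# P(20 or more flips) = 1 - P(we finish within 19 flips).
prob = 1 - sum(nb_pmf(k) for k in range(10, 20))
print(prob)  # 0.5
```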

ANSWER

The probability of getting at least $m$ heads in $n$ flips is just $P(X \geq m)$ where $X \sim \mathrm{Bin}(n,\frac{1}{2})$; flipping $20$ or more times is the same as getting fewer than $10$ heads in the first $19$ flips. Does this help?

ANSWER

The probability of flipping $20$ or more times is the probability that we get at most $9$ heads in the first $19$ tosses. That is, $$\sum_{j=0}^9\binom{19}{j}\frac{1}{2^{19}}.$$
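Evaluating this sum numerically (a quick check; by the symmetry $\binom{19}{j}=\binom{19}{19-j}$ it comes out to exactly $1/2$):

```python
from math import comb

# The 10 smallest binomial coefficients of 19 sum to exactly 2^18,
# so the probability is exactly 1/2.
prob = sum(comb(19, j) for j in range(10)) / 2**19
print(prob)  # 0.5
```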

ANSWER

Introduce imaginary extra random variables.

We define a martingale difference sequence $V_1,V_2,\ldots $ as follows.

For each instance of the problem, let $\tau$ be the number of flips and $x_1,x_2,\ldots, x_\tau$ the results. Note $\tau$ is different in each instance. Define $V_i = x_i - 1/2$ for $i=1,2,\ldots, \tau$, and $V_i = 0$ for each $i > \tau$. It is an exercise to show this defines a martingale difference sequence.

If we flip $100$ or more times, then we must have flipped $99$ times and still have a total below $10$. That means $x_1+\ldots + x_{99} < 10$. Since $\tau \ge 100$ we have $x_1+\ldots + x_{99}= V_1 + \ldots + V_{99} + 99/2$, and so $V_1 + \ldots + V_{99} < 10 - 99/2 = -79/2$. By Azuma-Hoeffding this happens with probability at most $$\exp \left( - \frac{2 \cdot (79/2)^2}{99}\right) \simeq e^{-31.5} \simeq 2 \times 10^{-14}.$$
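A quick numerical check of this Azuma-Hoeffding bound, using only the standard library:

```python
from math import exp

# Bound for P(V_1 + ... + V_99 < -79/2) with increments bounded by 1/2:
# exp(-t^2 / (2 * sum c_i^2)) = exp(-2 * (79/2)^2 / 99).
bound = exp(-2 * (79 / 2) ** 2 / 99)
print(bound)  # roughly 2e-14
```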

Note this works for any random variables on $[0,1]$ with expectation $1/2$ and easily adapts to arbitrary ranges and expectations.