An intuitive proof about the expected number of trials it takes for an event to happen


Imagine there is a fair spinner that can land on any number from $1$ through $100$. It is not too difficult to verify that if you spin this spinner $100$ times, then the expected number of $5$'s, for example, is equal to $1$. This can be done using linearity of expectation. Let $X_i$ be an indicator variable that equals $1$ if the $i$th spin results in a $5$ and equals $0$ otherwise, and let $X$ be the number of $5$'s that come up after $100$ spins. Then $X=\sum X_i$, and so \begin{align} \mathbb{E}(X) &= \mathbb{E}(X_1)+\mathbb{E}(X_2)+\mathbb{E}(X_3)+\ldots+\mathbb{E}(X_{100}) \\ &= \underbrace{0.01 + 0.01 + 0.01 + \ldots + 0.01}_{\text{$100$ times}} \\ &= 1 \, . \end{align}

I find this proof fairly easy to understand, and it makes the idea of expectation intuitive to me. However, there is another conception of expectation that I find harder to get my head around. Imagine you spin the spinner as many times as needed to get a $5$, and let $Y$ denote the number of spins. It turns out that $\mathbb{E}(Y)$ is equal to $100$, but I am unable to find a satisfying proof of this fact. I know you can use an infinite series to compute $\mathbb{E}(Y)$, but is there a proof like the one shown above that doesn't require much algebraic manipulation or number crunching? If it is possible to use linearity of expectation, then I would be very interested to see this.
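Both expectations can be checked empirically. A quick Monte Carlo sketch in Python (the function names here are my own, chosen for illustration) estimates $\mathbb{E}(X)$ and $\mathbb{E}(Y)$ by direct simulation:

```python
import random

def count_fives_in_100_spins(rng):
    """Spin a fair 1-100 spinner 100 times; count how many spins land on 5."""
    return sum(1 for _ in range(100) if rng.randint(1, 100) == 5)

def spins_until_five(rng):
    """Spin repeatedly until a 5 appears; return the number of spins needed."""
    spins = 1
    while rng.randint(1, 100) != 5:
        spins += 1
    return spins

rng = random.Random(0)  # fixed seed for reproducibility
trials = 20_000
avg_X = sum(count_fives_in_100_spins(rng) for _ in range(trials)) / trials
avg_Y = sum(spins_until_five(rng) for _ in range(trials)) / trials
print(avg_X)  # close to 1
print(avg_Y)  # close to 100
```

With $20{,}000$ trials the sample means land comfortably close to the theoretical values of $1$ and $100$.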


I don't mind if the proof involves infinite series; however, is it possible to improve upon this?

\begin{align} \sum_{n = 1}^{\infty} \left(\frac{99}{100}\right)^{n-1} \cdot \frac{1}{100} \cdot n &= \frac{1}{100} \sum_{n = 1}^{\infty} \left(\frac{99}{100}\right)^{n-1} \cdot n \\[6pt] &= \frac{1}{100} \sum_{n = 1}^{\infty} \left((n-1)\left(\frac{99}{100}\right)^{n-1} + \left(\frac{99}{100}\right)^{n-1}\right)\\[6pt] &= \frac{1}{100} \left(\sum_{n = 1}^{\infty} (n-1)\left(\frac{99}{100}\right)^{n-1} + \sum_{n = 1}^{\infty} \left(\frac{99}{100}\right)^{n-1}\right)\\[6pt] &= \frac{1}{100} \left(\frac{99/100}{(1-99/100)^2} + \frac{1}{1-99/100}\right)\\[6pt] &= \frac{1}{100} \left(9900 + 100\right)\\[6pt] &= 100 \end{align}
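The closed form can be sanity-checked numerically by truncating the series far enough out (a quick Python sketch):

```python
# Partial sum of  sum_{n>=1} n * (99/100)^(n-1) * (1/100).
# The tail beyond n = 10000 is astronomically small, since
# (0.99)^10000 is roughly e^-100, so the truncation error is negligible.
total = sum(n * 0.99 ** (n - 1) * 0.01 for n in range(1, 10_001))
print(total)  # ~ 100.0
```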

(This computation uses the 'Gabriel's staircase' series.)



Best answer

One way to think of it is to think about the first spin.

Case 1 (probability $.01$): First spin is $5$. Then you are done with $Y=1$.

Case 2 (probability $.99$): First spin is not $5$, and you can expect $E[Y]$ additional spins to get your first $5$.

So $E[Y]=.01\cdot(1)+.99\cdot(1+E[Y])$.

Solving for $E[Y]$: the equation simplifies to $E[Y] = 1 + .99E[Y]$, so $.01E[Y] = 1$, giving $E[Y] = 100$.
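The recursion can also be verified numerically: iterating the map $E \mapsto .01 \cdot 1 + .99\,(1+E)$ from any starting value converges to its unique fixed point, which is $100$ (a quick Python sketch):

```python
# Fixed-point iteration for E[Y] = .01*(1) + .99*(1 + E[Y]).
# The map is a contraction with factor 0.99, so convergence is
# slow (error shrinks like 0.99^k) but guaranteed.
E = 0.0
for _ in range(5000):
    E = 0.01 * 1 + 0.99 * (1 + E)
print(E)  # ~ 100
```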

Another answer

Well, it's easy enough with a series:

$$\sum_{i=1}^{\infty} i \times \mathcal{P}(\text{it takes exactly $i$ spins to get a $5$}) = \sum_{i=1}^{\infty} i \times \left(\frac{99}{100}\right)^{i-1} \times \frac{1}{100}$$

This is $\frac{1}{100} f'(0.99)$ where $$f(x) = \sum_{i=1}^{\infty} x^i = \frac{1}{1-x} - 1$$ That is readily calculated to be $100$.
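The number crunching checks out numerically: a truncated partial sum of $f'(x) = \sum_{i \ge 1} i x^{i-1}$ at $x = 0.99$ agrees with the closed form $f'(x) = 1/(1-x)^2$ (a quick Python sketch):

```python
x = 0.99
# Truncated series for f'(x); the tail beyond i = 10000 is negligible.
deriv_series = sum(i * x ** (i - 1) for i in range(1, 10_001))
# Closed form: d/dx [1/(1-x) - 1] = 1/(1-x)^2.
deriv_closed = 1 / (1 - x) ** 2
print(deriv_series * 0.01)  # ~ 100
print(deriv_closed * 0.01)  # ~ 100
```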

I don't know if that's "not much number crunching" - it doesn't seem too onerous to me, though I will certainly admit that it is practically intuition-free!