Recursive proof of expectation of geometric distribution?


If $T \sim \mathsf{Geo}(p)$ ($0 < p < 1$) then $\mathbf{E} T = 1/p$ is well known.

One "story" that captures this probablistic statement is the following:

Question. Suppose that I toss a sequence of coins that come up heads with probability $p$. On average, how many coins do I have to toss before I see a heads?

It is clear that if $T$ denotes the number of coins I have to toss, then $\mathbf{P}(T = k) = (1-p)^{k-1}p$, and hence $T \sim \mathsf{Geo}(p)$.

The typical proof simply does a computation of the arithmetic-geometric series to give the answer $1/p$. For example, if the coins are fair, I have to toss $2$ times on average.
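For completeness, a sketch of that series computation (using the standard identity $\sum_{k \ge 1} k x^{k-1} = 1/(1-x)^2$ for $|x| < 1$):

$$ \mathbf{E}T = \sum_{k=1}^{\infty} k\,(1-p)^{k-1}p = p \cdot \frac{1}{\bigl(1-(1-p)\bigr)^2} = \frac{1}{p}. $$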

I was wondering if there is a recursive way to prove this. It seems like there should be. For example, can someone give a combinatorial/probabilistic interpretation of the following:

$$ \mathbf{E}T = 1 + (1 - p)\,\mathbf{E}T $$

This identity comes up in the typical proof (via the series for the expectation of the geometric distribution).


On BEST ANSWER

We can condition on the first toss. Let $H$ denote the event that the first toss is a head.

\begin{align}E[T]&= E[T \mid H]\,P(H) + E[T \mid H^c]\,P(H^c) \\ &= p +(1+E[T])(1-p)\\ &= 1+(1-p)E[T] \end{align}

Solving for $E[T]$ gives us $\frac1{p}$.

Notice that $E[T \mid H^c]=1+E[T]$: given that the first toss is tails, one toss has been used and the process starts afresh, since the remaining tosses are independent of the first.
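As a sanity check on the recursive argument, here is a minimal simulation sketch (the function name and parameters are illustrative, not from the original post) that estimates $E[T]$ by tossing coins until the first head:

```python
import random

def estimate_geometric_mean(p, trials=100_000, seed=0):
    """Estimate E[T] for T ~ Geo(p) by simulating tosses until the first head."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        tosses = 1
        while rng.random() >= p:  # tails with probability 1 - p: toss again
            tosses += 1
        total += tosses
    return total / trials

# For a fair coin (p = 0.5) the estimate should be close to 1/p = 2.
```

The average over many trials should land near $1/p$, matching both the series computation and the conditioning argument.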