Coin tosses and Conditional Expectation


Consider tossing a fair coin repeatedly. Let $X$ denote the number of Tails before the first Heads, so that $X \sim \text{Geom}(1/2)$. Suppose we want to show that $\mathbb{E}[X \mid \text{first toss is Tails}] = 1 + \mathbb{E}[X]$.

Intuitively, if the first toss is Tails, then we have wasted one toss and, by memorylessness, are back where we started. Therefore $\mathbb{E}[X \mid \text{first toss is Tails}] = 1 + \mathbb{E}[X]$.

We can also prove this more rigorously as follows:
\begin{align*}
\mathbb{E}[X \mid \text{first toss is Tails}]
& = \sum_{k = 0}^{\infty} k \, \mathbb{P}(X = k \mid \text{first toss is Tails}) \\
& = \sum_{k = 1}^{\infty} k \, \mathbb{P}(X = k \mid X \ge 1) \\
& = \sum_{k = 1}^{\infty} k \, \mathbb{P}(X = (k - 1) + 1 \mid X \ge 1) \\
& = \sum_{k = 1}^{\infty} k \, \mathbb{P}(X = k - 1) \quad \text{(by memorylessness)} \\
& = \sum_{k = 0}^{\infty} (k + 1) \, \mathbb{P}(X = k) \\
& = \sum_{k = 0}^{\infty} k \, \mathbb{P}(X = k) + \sum_{k = 0}^{\infty} \mathbb{P}(X = k) \\
& = \mathbb{E}[X] + 1.
\end{align*}
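As a sanity check (not part of the proof), a quick Monte Carlo sketch can verify the identity numerically; the helper name `tails_before_first_heads` is mine, not from the original.

```python
import random

random.seed(0)

def tails_before_first_heads():
    """Toss a fair coin until Heads appears; return the number of Tails seen."""
    count = 0
    while random.random() < 0.5:  # interpret < 0.5 as Tails
        count += 1
    return count

N = 200_000
samples = [tails_before_first_heads() for _ in range(N)]

# E[X] = 1 for X ~ Geom(1/2) counting failures before the first success.
mean_x = sum(samples) / N

# The event "first toss is Tails" is exactly {X >= 1}.
cond = [x for x in samples if x >= 1]
mean_cond = sum(cond) / len(cond)

print(f"E[X]                 ~ {mean_x:.3f}")   # should be close to 1
print(f"E[X | first Tails]   ~ {mean_cond:.3f}")  # should be close to 1 + E[X] = 2
```

Both estimates should agree with the derivation above to within Monte Carlo error.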

Now suppose we want to find $\mathbb{E}[W_{HH} \mid \text{first toss is T}]$, where $W_{HH}$ denotes the number of tosses until $HH$ appears. By the same intuitive argument, $\mathbb{E}[W_{HH} \mid \text{first toss is T}] = 1 + \mathbb{E}[W_{HH}]$. But is there a rigorous way to prove this? I could not use an approach similar to the one above, since $W_{HH}$ is not geometric.


$\newcommand{\PM}{\mathbb P}\newcommand{\E}{\mathbb E}$I don't know if this is the kind of answer you are waiting for, but one can prove the claim easily using the theory of (discrete-time) Markov processes.

Define the i.i.d. sequence $(B_k)_{k\geq 0}$ where $B_k \sim \text{Bernoulli}\left(\frac 1 2\right)$. Now define the two-dimensional stochastic process $X_k$ by: $$X_{k}:= (B_k, B_{k+1})$$ This is clearly a homogeneous Markov chain.

Define the following hitting time $T$: $$T:=\inf\{ k \mid X_k = (1,1) \}$$ Now notice that $W_{HH}=T+2$ (the pair $HH$ occupies tosses $B_T$ and $B_{T+1}$, i.e. $T+2$ tosses in total), so $W_{HH}$ and $T$ differ by a constant. Moreover, 'first toss is T' is equivalent to $X_0 \in U$ where $U:=\{ (0,0), (0,1) \}$. One knows from the properties of a Markov chain that (see this): $$\E[T\mid X_0 \in U ] = 1+\E[T]$$ Since $W_{HH} = T + 2$, adding the constant $2$ to both sides gives: \begin{align*} \E[W_{HH}\mid\text{first toss is T} ] = 1+\E[W_{HH}] \end{align*}
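The identity for $W_{HH}$ can also be checked numerically. Below is a minimal simulation sketch (my own, not from the answer); recall $\E[W_{HH}] = 6$ for a fair coin, so the conditional expectation should come out near $7$.

```python
import random

random.seed(1)

def tosses_until_hh():
    """Toss a fair coin until the pattern HH appears.
    Return (total number of tosses, outcome of the first toss)."""
    prev = None
    n = 0
    first = None
    while True:
        toss = 'H' if random.random() < 0.5 else 'T'
        n += 1
        if first is None:
            first = toss
        if prev == 'H' and toss == 'H':
            return n, first
        prev = toss

N = 200_000
results = [tosses_until_hh() for _ in range(N)]

# Unconditional mean: E[W_HH] = 6 for a fair coin.
mean_w = sum(n for n, _ in results) / N

# Condition on the first toss being Tails.
cond = [n for n, f in results if f == 'T']
mean_cond = sum(cond) / len(cond)

print(f"E[W_HH]              ~ {mean_w:.3f}")   # should be close to 6
print(f"E[W_HH | first T]    ~ {mean_cond:.3f}")  # should be close to 1 + E[W_HH] = 7
```

The simulated conditional mean exceeds the unconditional one by almost exactly $1$, matching the Markov-chain argument.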