Suppose one plays the following game: Start with $n$ rounds. Each round, flip a weighted coin with probability $p$ of coming up heads. Each time we obtain heads, we get $m$ additional rounds added to our current total. We play until we run out of rounds. The expected number of rounds played is: $$ E[\# Rounds]= n + nmp + n(mp)^2 + n(mp)^3 +\dots $$ My question is how does one derive this expectation formula? (This is a question for work, not school).
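Before deriving the formula, it can help to sanity-check it numerically. Below is a minimal Monte Carlo sketch (the function name `simulate_rounds` and all parameters are illustrative, not from the original post) that plays the game many times and compares the average against the series, which sums to $n/(1-mp)$ when $mp<1$:

```python
import random

def simulate_rounds(n, m, p, trials=100_000, rng=None):
    """Estimate the expected number of rounds by direct simulation.

    Play the game `trials` times: start with n rounds; each round,
    a heads (probability p) adds m more rounds.  Return the average
    number of rounds actually played.
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    total = 0
    for _ in range(trials):
        rounds_left = n
        played = 0
        while rounds_left > 0:
            rounds_left -= 1
            played += 1
            if rng.random() < p:
                rounds_left += m
        total += played
    return total / trials

# For n=1, m=2, p=0.25 the series sums to 1/(1 - 0.5) = 2,
# and the simulated average should land close to that.
```

Note that the simulation only terminates with probability 1 when $mp<1$; for $mp\ge 1$ the expected number of rounds is infinite.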
Expected number of rounds played in a game
767 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail). There are 3 best solutions below.
The expected number of additional rounds won when playing 1 round is $mp$, so the expected number of additional rounds won when playing $x$ rounds is $xmp$.
$E[\#\text{rounds}]$ is the initial number of rounds ($a=n$), plus the expected number of rounds added when playing the initial $a$ rounds ($b=amp=nmp$), plus the expected number of rounds added when playing the additional $b$ rounds ($c=bmp=n(mp)^2$), plus the expected number of rounds added when playing the additional $c$ rounds ($d=cmp=n(mp)^3$), and so on.
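The $a, b, c, d, \ldots$ bookkeeping above can be mirrored directly in code: each "batch" of rounds contributes its expected size to the total, then shrinks by a factor of $mp$. A short sketch (the function name and `terms` cutoff are illustrative; it assumes $mp<1$ so the tail is negligible):

```python
def expected_rounds_series(n, m, p, terms=50):
    """Sum the series n + n*m*p + n*(m*p)**2 + ... term by term.

    Mirrors the a, b, c, d reasoning: `batch` is the expected size of
    the current generation of rounds; playing it is expected to add
    batch * m * p further rounds.  Assumes m*p < 1 for convergence.
    """
    total, batch = 0.0, float(n)
    for _ in range(terms):
        total += batch     # rounds contributed by this generation
        batch *= m * p     # expected rounds the generation adds
    return total
```

With $mp<1$, fifty terms already agree with the closed form $n/(1-mp)$ to within floating-point precision, since the truncation error is $n(mp)^{50}/(1-mp)$.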
This is not a complete answer, just some hints on how to deduce the formula without resorting to any deep theory (this is similar in spirit to @Kevin's answer).
First you have to show that the expected number $E_n$ of rounds is finite iff $mp<1$; there are several ways to do this (including the one given below for computing the expectation).
Then you should observe that $E_n$ depends linearly on the initial number of rounds, i.e. $E_n = n E_1$.
Finally, conditioning on the first toss and using the law of total expectation, $$ E_1 = 1+ p \cdot E_m + (1-p)\cdot 0 = 1+ p \cdot mE_1 $$ (one toss, plus the expected value if we get heads times the probability of heads, plus the expected value if we get tails times the probability of tails; the second equality uses the linearity observed above, $E_m = mE_1$).
Hence, for $mp<1$ (the finiteness of expectation is essential!), $$ E_1 = \frac{1}{1-mp}, $$ and so $E_n = \dfrac{n}{1-mp}$, which is exactly the sum of the geometric series in the question.
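The fixed-point relation and the closed form are easy to check against each other numerically. A small sketch (function name `e1_closed_form` is illustrative):

```python
def e1_closed_form(m, p):
    """Closed form E_1 = 1/(1 - m*p), valid only when m*p < 1."""
    if m * p >= 1:
        raise ValueError("expectation is infinite when m*p >= 1")
    return 1.0 / (1.0 - m * p)

# Verify the fixed-point equation E_1 = 1 + p*m*E_1 for sample values.
m, p = 3, 0.2
e1 = e1_closed_form(m, p)
assert abs(e1 - (1 + p * m * e1)) < 1e-12
```

The check works for any $m, p$ with $mp<1$, since $E_1 = 1/(1-mp)$ is the unique solution of the linear equation $E_1 = 1 + pmE_1$.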
This is an example of a branching process. At the beginning, we have $Z_0=n$ individuals. At the first step, each individual produces $X_{0,i}$ ($i=1,\ldots,n$) offspring, where $\mathbb P(X_{0,i}=m)=p$, $\mathbb P(X_{0,i}=0)=1-p$, and the $X_{0,i}$ are independent. After the first step we have $$ Z_1=\sum_{i=1}^{Z_0=n} X_{0,i}, \quad \mathbb E[Z_1]=\sum_{i=1}^n \mathbb E[X_{0,i}] =n\mathbb E[X_{0,1}]=nmp.$$

At the next step, we have a random number $Z_1$ of individuals. Each of them produces $X_{1,i}$ ($i=1,\ldots,Z_1$) offspring, where $\mathbb P(X_{1,i}=m)=p$, $\mathbb P(X_{1,i}=0)=1-p$, and the $X_{1,i}$ are independent and do not depend on $Z_1$. After the second step we have $$ Z_2=\sum_{i=1}^{Z_1} X_{1,i}. $$ To calculate the mean value of $Z_2$, since the number of summands is itself random, we need Wald's equation: $$ \mathbb E[Z_2]=\mathbb E[Z_1] \cdot \mathbb E[X_{1,1}] =nmp\cdot mp=n(mp)^2.$$

Next, $$ Z_3=\sum_{i=1}^{Z_2} X_{2,i}, \quad \mathbb E[Z_3]=\mathbb E[Z_2] \cdot \mathbb E[X_{2,1}] =n(mp)^2\cdot mp=n(mp)^3, $$ and so on. To get the total expected number of individuals (i.e., of rounds played), we sum the infinite series $$ \mathbb E[Z_0+Z_1+Z_2+\ldots] = n+nmp+n(mp)^2+n(mp)^3+\ldots $$
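The generation means $\mathbb E[Z_k]=n(mp)^k$ can also be estimated by simulating the branching process directly. A hedged sketch (the function name, `steps`, `trials`, and `seed` parameters are all illustrative choices, not from the answer):

```python
import random

def simulate_generation_means(n, m, p, steps=3, trials=50_000, seed=0):
    """Estimate E[Z_k] for k = 0..steps in the branching process.

    Start with Z_0 = n individuals; each individual independently
    produces m offspring with probability p, and 0 otherwise.
    Returns the list of sample means [E[Z_0], E[Z_1], ...].
    """
    rng = random.Random(seed)
    sums = [0] * (steps + 1)
    for _ in range(trials):
        z = n
        sums[0] += z
        for k in range(1, steps + 1):
            # each of the z current individuals reproduces independently
            z = sum(m for _ in range(z) if rng.random() < p)
            sums[k] += z
    return [s / trials for s in sums]
```

For $n=4$, $m=2$, $p=0.25$ (so $mp=0.5$), the sample means should track $4, 2, 1, 0.5, \ldots$, matching $\mathbb E[Z_k]=n(mp)^k$.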