This is closely related to the question in the link below, but I cannot really map it to mine...
I play a game, with probability $r$ the game adds $g$ more rounds. What is the expected number of rounds played?
So, I was trying to think of it as the game in the link, treating it as if a player must win by a difference of $g$, but I cannot find a way to make that work.
$h=\text{number of rounds played}$
$E(h) = 1 + (1-r)\cdot b + r\cdot c$
where $b$ is $E$(number of rounds played given that the last round did not add $g$ extra rounds) and
$c$ is $E$(number of rounds played given that the last round did add $g$ extra rounds).
But I have no clue whether this makes sense, because now I cannot model $b$ and $c$.
I have this feeling that this gets somehow recursive... like $b = g + (1-r)\cdot b + r\cdot c$
but I am quite stuck :S
I am assuming that each of the $g$ extra rounds can itself add $g$ more rounds with probability $r$, independently of everything else. Correct me if I am wrong.
I will also assume $rg<1$. If $rg>1$, there is a positive probability that the game never ends, so the expected number of rounds is infinite.
Let $E_n$ be the expected number of rounds when you have $n$ rounds to go. You play one round, after which the counter drops to $n-1$ with probability $1-r$ or jumps to $n+g-1$ with probability $r$, so $$ E_n=1+(1-r)E_{n-1}+rE_{n+g-1}. $$
Furthermore, $$E_n=nE_1,$$ because in order to start at $n$ rounds and end at $0$, you need to drop from $k$ to $k-1$ a total of $n$ times, for $k=n,n-1,\dots,1$, and dropping from $k$ to $k-1$ is the same as dropping from $1$ to $0$.
You can combine these two equations to solve for $E_n$. The result is $$ E_n = nE_1=n/(1-rg). $$ When $rg=1$, the game will certainly end, but the expected end time is infinite.
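As a quick sanity check, here is a short Python snippet (with arbitrary illustrative values $r=0.2$, $g=3$, chosen so that $rg<1$) verifying that $E_n=n/(1-rg)$ does satisfy the recursion:

```python
# Check that E_n = n/(1 - r*g) satisfies E_n = 1 + (1-r)*E_{n-1} + r*E_{n+g-1}.
# r = 0.2 and g = 3 are arbitrary test values with r*g < 1.
r, g = 0.2, 3

def E(n):
    return n / (1 - r * g)

for n in range(1, 20):
    lhs = E(n)
    rhs = 1 + (1 - r) * E(n - 1) + r * E(n + g - 1)
    assert abs(lhs - rhs) < 1e-12
print("recursion holds for n = 1..19")
```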
Here is another explanation why $E_n=nE_1$. Assume the counter starts at $n$. \begin{align} \text{number of rounds until counter hits $0$} &=\quad \text{number of rounds until counter hits $n-1$ }\\ &\quad+ \text{number of rounds after that until counter hits $n-2$ }\\ &\quad+ \text{number of rounds after that until counter hits $n-3$ }\\ &\quad\;\vdots\\ &\quad+ \text{number of rounds after that until counter hits $1$ }\\ &\quad+ \text{number of rounds after that until counter hits $0$ }\\ \end{align} The left hand side of the above equation is $E_n$ on average, while each summand on the right is $E_1$ on average; if the counter is at $7$, then the time spent waiting for the counter to reach $6$ for the first time is exactly like starting at $1$ and waiting for the counter to reach $0$, which is $E_1$.
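The whole argument can also be checked by simulation. The sketch below (Python; parameter values again arbitrary) keeps a counter of pending rounds, lets each round played add $g$ more with probability $r$, and compares the empirical average to $n/(1-rg)$:

```python
import random

def simulate_rounds(n, r, g, rng):
    """Play until the counter of pending rounds hits 0.
    Each round played adds g more rounds with probability r."""
    pending, played = n, 0
    while pending > 0:
        pending -= 1
        played += 1
        if rng.random() < r:
            pending += g
    return played

def mean_rounds(n, r, g, trials=100_000):
    rng = random.Random(42)  # fixed seed for reproducibility
    return sum(simulate_rounds(n, r, g, rng) for _ in range(trials)) / trials

# Theory predicts n/(1 - r*g): for r=0.2, g=3 that is 2.5*n.
print(mean_rounds(1, 0.2, 3))  # should be close to 2.5
print(mean_rounds(5, 0.2, 3))  # should be close to 12.5
```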