The following question is from the exercises in the martingale chapter of the textbook by Koralov and Sinai. I'm having trouble formalizing this problem in terms of martingale theory.
Ann and Bob are gambling at a casino. In each game the probability of winning a dollar is 48%, and the probability of losing a dollar is 52%. Ann decided to play 20 games, but will stop after 2 games if she wins them both. Bob decided to play 20 games, but will stop after 10 games if he wins at least 9 out of the first 10. What is larger: the amount of money Ann is expected to lose, or the amount of money Bob is expected to lose?
My idea is to appeal to the optional stopping theorem, by defining a stopping time $\tau_{A}$ for Ann and a stopping time $\tau_{B}$ for Bob. But I'm not quite sure how to define these times. Any help or hints along these lines would be appreciated.
Just thinking: if $X_{j}$ is the random variable representing the amount gained/lost in the $j$th game, taking the value $1$ with probability $.48$ and $-1$ with probability $.52$, and $S_n = \sum_{k=1}^{n} X_k$, then Ann's rule only ever stops play at game $2$, so $$ \tau_{A} = \begin{cases} 2 & \text{if } X_1 = X_2 = 1,\\ 20 & \text{otherwise,}\end{cases}$$ and I think $$ \tau_{B} = \begin{cases} 10 & \text{if } S_{10} \geq 8,\\ 20 & \text{otherwise}\end{cases}$$ (winning at least $9$ of the first $10$ games means $S_{10} \geq 9 - 1 = 8$). The cap at $20$ games is already built into these definitions, so no further minimum with a constant time $T = 20$ is needed.
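As a sanity check on these stopping rules, here is a quick Monte Carlo simulation (a sketch; the function names `play_ann` and `play_bob` are mine, not from the book) that estimates the average amount each player ends up with:

```python
import random

random.seed(0)
p = 0.48        # probability of winning a dollar in one game
N = 200_000     # number of simulated sessions per player

def play_ann():
    """Play up to 20 games; stop after 2 games if both were wins."""
    total = 0
    for n in range(1, 21):
        total += 1 if random.random() < p else -1
        if n == 2 and total == 2:   # won the first two games
            break
    return total

def play_bob():
    """Play up to 20 games; stop after 10 if at least 9 of them were wins."""
    total = 0
    for n in range(1, 21):
        total += 1 if random.random() < p else -1
        if n == 10 and total >= 8:  # at least 9 wins out of 10 means S_10 >= 8
            break
    return total

ann = sum(play_ann() for _ in range(N)) / N
bob = sum(play_bob() for _ in range(N)) / N
print(f"Ann's average winnings: {ann:+.3f}")
print(f"Bob's average winnings: {bob:+.3f}")
```

Both averages come out negative, with Ann's closer to zero than Bob's.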
Since each game has expectation $E[X_j] = 0.48 - 0.52 = -0.04$, Wald's identity (equivalently, the optional stopping theorem applied to the martingale $S_n + 0.04n$) gives $E[S_\tau] = -0.04\,E[\tau]$: the expected loss is directly proportional to the expected number of games played. Therefore, the person who plays, on average, fewer games is expected to lose less. Ann plays $E[\tau_A] = 2(0.48)^2 + 20\left(1-(0.48)^2\right) \approx 15.85$ games on average, while Bob plays $E[\tau_B] = 10p + 20(1-p) \approx 19.92$ games, where $p = \binom{10}{9}(0.48)^9(0.52) + (0.48)^{10} \approx 0.0077$. Thus Ann is expected to lose less.
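The comparison can be made exact with a short computation (a sketch; the variable names are mine), using Wald's identity $E[S_\tau] = E[X_1]\,E[\tau]$:

```python
from math import comb

p, q = 0.48, 0.52   # win / loss probability per game

# Wald's identity: E[S_tau] = E[X_1] * E[tau] = -(q - p) * E[tau],
# so the expected loss is 0.04 * E[tau].

# Ann: stops at 2 games with probability p^2, otherwise plays all 20.
p_ann_stop = p ** 2
e_tau_ann = 2 * p_ann_stop + 20 * (1 - p_ann_stop)

# Bob: stops at 10 games if he wins at least 9 of the first 10.
p_bob_stop = comb(10, 9) * p ** 9 * q + p ** 10
e_tau_bob = 10 * p_bob_stop + 20 * (1 - p_bob_stop)

loss_ann = (q - p) * e_tau_ann   # approx. 0.63 dollars
loss_bob = (q - p) * e_tau_bob   # approx. 0.80 dollars
print(f"Ann's expected loss: ${loss_ann:.3f}  (E[tau_A] = {e_tau_ann:.2f})")
print(f"Bob's expected loss: ${loss_bob:.3f}  (E[tau_B] = {e_tau_bob:.2f})")
```

Ann's expected loss is about $\$0.63$ versus about $\$0.80$ for Bob, consistent with Ann playing fewer games in expectation.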