Taking roulette as a stand-in for lotteries in general: its expected value is negative, which means that in the limit of playing for eternity, one expects to lose a certain amount of money for each dollar bet.
The expected value for the sequence of bets is $ \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} \big( \text{win} \cdot p(\text{win}) + \text{loss} \cdot p(\text{loss}) \big) $, where loss carries its sign (e.g. $\text{loss} = -1$).
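Since every bet is identical and independent, the long-run average above collapses to the per-bet expected value. A minimal sketch, assuming the European even-money numbers given later in the question:

```python
# Per-bet expected value of an even-money roulette bet.
# Numbers are for European roulette (one zero), as stated in the question.
win, loss = 1, -1
p_win, p_loss = 18 / 37, 19 / 37

ev_per_bet = win * p_win + loss * p_loss
print(ev_per_bet)  # about -0.027, i.e. -1/37 dollars per dollar bet
```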
How does this expected value change if the strategy is to stop after winning for the first time? Assume unlimited time and money, as we implicitly do when taking the limit above.
The expected value becomes
$\lim_{n \to \infty}(\frac{S_n}{n})$
Where
$ S_1 = (\text{win})*p(\text{win}) + (\text{loss})*p(\text{loss}) $
$ S_n = (\text{win})*p(\text{win}) + p(\text{loss}) * (\text{loss} + S_{n-1}) $
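One way to get a feel for this limit is to iterate the recursion numerically. A sketch, under the assumption that the recursion above is taken at face value: because $p(\text{loss}) < 1$, the iteration is a contraction and $S_n$ converges to the fixed point $S = \frac{\text{win} \cdot p(\text{win}) + \text{loss} \cdot p(\text{loss})}{p(\text{win})}$, so $\lim_{n \to \infty} S_n / n = 0$.

```python
# Iterate the recursion S_n = win*p_win + p_loss*(loss + S_{n-1})
# and compare against its fixed point. European roulette numbers.
win, loss = 1, -1
p_win, p_loss = 18 / 37, 19 / 37

s = win * p_win + loss * p_loss      # S_1
for _ in range(200):                 # S_2 ... S_201; error shrinks by p_loss each step
    s = win * p_win + p_loss * (loss + s)

# Solve S = win*p_win + p_loss*(loss + S) for S:
fixed_point = (win * p_win + loss * p_loss) / p_win
print(s, fixed_point)  # both about -0.0556, i.e. -1/18
# Since S_n converges to a constant, lim S_n / n = 0.
```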
I feel too rusty to actually compute what that limit would be, and haven't been able to find the result that surely exists somewhere.
For European roulette when betting on colors, win = 1, loss = -1, p(win) = 18/37 and p(loss) = 19/37, but I am more interested in what the expected value looks like in terms of win, loss, p(win) and p(loss) than in a concrete number.
If you bet one dollar until you win for the first time, the result is a simple geometric series. Or you use a simple equation: with roulette and two zeroes you either win the first time or not, so
$ E = \text{win} \cdot p(\text{win}) + p(\text{loss}) \cdot (\text{loss} + E), $
that is,
$ E = \frac{\text{win} \cdot p(\text{win}) + \text{loss} \cdot p(\text{loss})}{p(\text{win})} = \frac{18/38 - 20/38}{18/38} = -\frac{1}{9}. $
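The two routes can be checked against each other numerically. A sketch, assuming double-zero (American) roulette with an even-money bet: the geometric series sums, over the bet $k$ on which the first win occurs, the probability $p(\text{loss})^{k-1} p(\text{win})$ times the net result $\text{win} + (k-1)\cdot\text{loss}$.

```python
# Compare the closed form E = (win*p_win + loss*p_loss)/p_win with a
# (truncated) geometric series, for American roulette (two zeroes).
win, loss = 1, -1
p_win, p_loss = 18 / 38, 20 / 38

closed_form = (win * p_win + loss * p_loss) / p_win

# First win on bet k: probability p_loss**(k-1) * p_win, net win + (k-1)*loss.
# Truncating at k = 2000 leaves a negligible tail (p_loss**2000).
series = sum(p_loss ** (k - 1) * p_win * (win + (k - 1) * loss)
             for k in range(1, 2000))
print(closed_form, series)  # both about -0.111, i.e. -1/9
```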
Note that if your bets grow too fast (like doubling after each loss), some formulas for expected values no longer apply, because the amount you have at risk becomes unbounded: the convenient stopping arguments require conditions such as bounded bets.
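To illustrate why doubling behaves differently, here is a sketch under an assumed finite cap of $m$ doublings (a finite bankroll): the strategy then wins $1$ with probability $1 - q^m$ and loses $2^m - 1$ with probability $q^m$, where $q = p(\text{loss})$, giving an expected value of $1 - (2q)^m$, which is negative and blows up as $m$ grows whenever $q > 1/2$.

```python
# Expected value of a capped doubling ("martingale") strategy on American
# roulette: win +1 with prob 1 - q**m, lose 2**m - 1 with prob q**m.
q = 20 / 38  # p(loss) for an even-money bet with two zeroes
evs = []
for m in (5, 10, 20):
    ev = (1 - q ** m) * 1 - q ** m * (2 ** m - 1)  # simplifies to 1 - (2*q)**m
    evs.append(ev)
    print(m, ev)
```

Since $2q = 40/38 > 1$ here, the loss term dominates more and more as the cap grows, so the usual "small EV per bet" intuition no longer describes the strategy.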