How to find optimal strategies for infinite betting games?


Suppose we have a game structure of the following form:

Game: You have an $n$-sided die. At any point in the game there is a "value" associated with the rolls that have already occurred. You can choose to stop the game and accept the current value, or you can choose to roll again.

The question is, how can one prove that a certain strategy is optimal (in the sense of expected value) for this game?

At each stage, you can calculate the expected gain from exactly one more roll. Of course, it always makes sense to roll if this gain is positive. But it is not necessarily the case that you should stop when it is negative, since future rolls may still give you the opportunity to make more money.

Some examples

Ex1: You flip a coin repeatedly, and the value at any point is the fraction of heads, $\frac{H}{H + T}$.
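For this example the one-step lookahead mentioned above can be computed exactly. Here is a small sketch (the function name is mine) using exact rational arithmetic; it shows that one more flip has positive expected gain exactly when $\frac{H}{H+T} < 1/2$:

```python
from fractions import Fraction

def one_step_gain(h, t):
    """Expected change in the fraction H/(H+T) after exactly one more flip.

    Requires h + t >= 1. With probability 1/2 the flip is heads, giving
    (h+1)/(h+t+1); with probability 1/2 it is tails, giving h/(h+t+1).
    """
    current = Fraction(h, h + t)
    after = (Fraction(1, 2) * Fraction(h + 1, h + t + 1)
             + Fraction(1, 2) * Fraction(h, h + t + 1))
    return after - current

# The gain simplifies to (t - h) / (2 (h+t) (h+t+1)): positive iff t > h,
# i.e. iff the current fraction is below 1/2, and zero when h == t.
```

This justifies flipping whenever the fraction is at or below $1/2$, but by itself says nothing about whether stopping as soon as the fraction exceeds $1/2$ is optimal.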

Ex2: (Was a popular question here recently.) You roll a six-sided die and your value is the running total, but you immediately 'bust' (losing everything) if the total is ever a perfect square.
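The same one-step computation works for this example too. A sketch (the function name is mine, and I assume busting forfeits the whole total):

```python
import math

def bust_one_step_gain(total):
    """Expected change in value from one more d6 roll, given the current total.

    Stopping keeps `total`; rolling face f gives total + f, unless total + f
    is a perfect square, in which case you bust and the value drops to 0
    (assumed: busting forfeits everything).
    """
    gain = 0.0
    for face in range(1, 7):
        nxt = total + face
        if math.isqrt(nxt) ** 2 == nxt:   # bust: lose the whole total
            gain += (0 - total) / 6
        else:
            gain += face / 6
    return gain

# When no perfect square lies in (total, total + 6] the gain is +3.5, so you
# should certainly roll; just below a large square it is roughly -total/6.
```

Note that a negative one-step gain here still does not settle the decision: surviving the one dangerous roll puts you back into a stretch of safe rolls, which is exactly the caveat above.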

Slightly more formally, we could say:

Definition: A strategy for a game as described above is a rule that, at each state, decides whether to roll or stay. A game is well-behaved if, among all strategies that stop with probability 1, at least one achieves a maximal finite expected value.

And so my (admittedly open-ended)

Question: Are there general techniques for proving that a game is well-behaved, and in particular for showing that a given strategy is optimal?

I am particularly interested in Ex1 above. I believe that the optimal strategy is to stop as soon as $\frac{H}{H + T} > 1/2$, which happens eventually with probability 1. But I do not have any idea how to prove that, beyond the fact that it is always at least as good to flip when $\frac{H}{H + T} \leq 1/2$, and that when $\frac{H}{H + T} > 1/2$ the next flip always has negative expected gain.
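One concrete thing I can do is approximate the game numerically: force the game to stop after $n$ flips and run backward induction, which gives a lower bound on the optimal value that increases with $n$. A sketch (the function name and truncation scheme are mine, and this is only numerics, not a proof):

```python
def coin_value(n):
    """Lower bound on the optimal expected value of Ex1, obtained by
    restricting to strategies that must stop after at most n flips."""
    # At the horizon (h + t == n) you are forced to stop and take h/n.
    V = [h / n for h in range(n + 1)]          # V[h] with t = total - h
    for total in range(n - 1, 0, -1):          # backward induction
        V = [max(h / total,                    # stop now, or ...
                 0.5 * (V[h + 1] + V[h]))      # ... flip once more
             for h in range(total + 1)]
    # From the empty start there is nothing to stop on, so you must flip:
    return 0.5 * (V[1] + V[0])

# coin_value(n) is nondecreasing in n and converges to the true optimal
# value as n grows; comparing it with the expected value of the
# "stop when H/(H+T) > 1/2" rule would at least test the conjecture.
```

But I do not see how to turn this kind of computation into a proof, which is why I am asking.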