Imagine a dice game in which you may repeatedly roll a die until you either decide to stop or roll a $1$, with the following payoff function (where $k$ is the number shown on the die):
$$f(k) = \begin{cases} 0 & \text{if } k = 1, \\ k^2 & \text{if } k > 1. \end{cases}$$
So if on your $n$th roll you land a $1$, you go home with nothing. If on your $n$th roll you land a $4$, you get $16$ USD, and can either roll again or go home with the $16$ USD.
My question: how should I set up the optimal stopping problem, once I have written down the transition matrix (making this a Markov process whose states correspond to the number I land on)? Any hints would be appreciated, thank you.
Suppose you have rolled a $4$, which gives a payout of $16$; should you roll again? If you continue, it only makes sense to re-roll $2$s and $3$s, so the probability of eventually landing on a given $j \in \{4,5,6\}$ is that of rolling a $2$ or $3$ exactly $n$ times and then a $j$, summed over all $n \in \mathbb{N}_0$: $$\mathbb{P}(\textrm{landing on }j)=\sum_{n=0}^\infty \left(\frac{1}{3}\right)^n \frac{1}{6}= \frac{1}{6} \cdot \frac{3}{2} = \frac{1}{4}.$$ So you have a $\frac{1}{4}$ chance of ending on a $1$ (payoff $0$), and a $\frac{3}{4}$ chance of a payoff at least as good as your current one, giving an expected payoff of $$ \frac{1}{4} (0+16+25+36) = \frac{77}{4} > 16.$$ So you should continue rolling.
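As a quick sanity check (a sketch, assuming the strategy of always re-rolling $2$s and $3$s as above), the stopping probability and the continuation value from a $4$ can be computed with exact rational arithmetic:

```python
from fractions import Fraction

# Probability of eventually landing on any given j in {1, 4, 5, 6}
# when 2s and 3s are always re-rolled: sum over n of (2/6)^n * (1/6),
# a geometric series with ratio 1/3.
p_j = Fraction(1, 6) / (1 - Fraction(2, 6))
assert p_j == Fraction(1, 4)

# Expected payoff of continuing from a 4 (a roll of 1 pays 0).
value = p_j * (0 + 16 + 25 + 36)
print(value)  # 77/4 = 19.25 > 16, so keep rolling
```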
If you get a $5$, the same reasoning (now re-rolling $2$s, $3$s, and $4$s, so each of $1$, $5$, $6$ is reached with probability $\frac{1}{3}$) gives an expected payoff from continuing of $$ \frac{1}{3}(0+25+36) = \frac{61}{3} < 25.$$ So if you get a $5$, you should stop.
Thus the optimal strategy is to keep rolling until you have rolled at least a $5$, and then stop.
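You can also confirm the threshold numerically by iterating the Bellman equation $V(k) = \max\big(f(k),\, c\big)$, where $c = \frac{1}{6}\big(0 + \sum_{k=2}^{6} V(k)\big)$ is the value of rolling again (the $\frac16$ chance of a $1$ contributes $0$). A minimal value-iteration sketch (not part of the original derivation):

```python
# States 2..6; rolling a 1 ends the game with payoff 0.
payoff = {k: k * k for k in range(2, 7)}

# Value iteration: V(k) = max(stop now, expected value of rolling again).
V = {k: 0.0 for k in range(2, 7)}
for _ in range(1000):
    c = sum(V.values()) / 6          # continuation value; the 1 contributes 0
    V = {k: max(payoff[k], c) for k in range(2, 7)}

c = sum(V.values()) / 6
stop_set = [k for k in range(2, 7) if payoff[k] >= c]
print(stop_set)  # [5, 6]: stop once you roll at least a 5
print(c)         # ~20.333, i.e. 61/3, matching the computation above
```

The fixed point satisfies $c = \frac{1}{6}(25 + 36 + 3c)$, i.e. $c = \frac{61}{3}$, so the iteration recovers exactly the $\frac{61}{3}$ found by hand.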