I came up with a (probably unoriginal) paradox today, and was wondering how it might be resolved. The style of reasoning seems to resemble basic game-theoretic arguments.
Suppose a casino game has an expected return rate of over 100% of the cost to play (in other words, the game is profitable long term). A player would want to play the game repeatedly, and continue to gain profit. However, due to the variance in the outcome of each individual round, the question is raised, "how much starting capital is needed to play this game and not go bankrupt, with probability P% (i.e., a 95% chance of not going bankrupt, or 99%, or whatever percentage the player is comfortable with)?"
Now suppose that for whatever value of P the player selects, math or simulation is used to determine that 21 USD is the minimum required starting capital to achieve that desired level of safety. Having only 20 USD is insufficient (only whole USD are used in the game). If we assume that the outcome of each round is independent of the last, it follows that when a new round begins, if the player's current capital is 20 USD or less, then the player should stop playing.
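To make the setup concrete, here is a sketch of the kind of simulation that could determine the required capital. The game itself is an illustrative assumption (an even-money $\$1$ bet won with probability $p = 0.55$), not something specified in the question:

```python
import random

# Monte Carlo estimate of P, the chance of never going bankrupt, for a
# hypothetical game: bet 1 USD per round, win with probability p = 0.55.
# The game, the stake, and p are illustrative assumptions.

def prob_survival(start, p=0.55, rounds=1000, trials=2000, seed=1):
    """Estimate the probability that capital never hits 0 within `rounds`
    rounds (with favourable drift, almost all ruin happens early, so a
    finite horizon is a good proxy for 'never')."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        cash = start
        for _ in range(rounds):
            cash += 1 if rng.random() < p else -1
            if cash == 0:
                break
        else:
            survived += 1
    return survived / trials

print(prob_survival(15))   # roughly 0.95 for this illustrative game
```

Running this for increasing starting capital locates the minimum stake that achieves any desired survival probability $P$.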
However, once the player knows that he or she will walk away if at or below 20 USD, it becomes irrational to play the game with a starting capital of 21-40 USD. The probability of not going bankrupt should now be calculated as the probability of not declining to 20 USD capital (which we can call P'), as this will now be the trigger for the player to walk away from the game forever (comparable to true bankruptcy, with a 20 USD offset). P' will be very low (we might expect far less than 50%) when the starting capital is 21 USD, and P' at 40 USD will be equal to what P was at 20 USD.
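The claim that $P'$ is far below $50\%$ can be made concrete. For an illustrative even-money game betting $1$ USD per round with win probability $p > \tfrac12$ (this specific game is an assumption, not part of the question), the chance that the favourable random walk ever falls $1$ USD below its starting point is $q/p$, where $q = 1-p$. So at a starting capital of $21$ USD,

$$P' \;=\; 1 - \frac{q}{p}, \qquad \text{e.g. } p = 0.55 \;\Rightarrow\; P' = 1 - \frac{0.45}{0.55} \approx 0.18,$$

which is indeed far less than $50\%$.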
Therefore you should not play the game with a starting capital of less than 41 USD. But then it also becomes irrational to play the game at 41-60 USD, because P'' is now the relevant probability to consider, and for that range, it is unacceptably low.
Continue with this pattern indefinitely.
The conclusion of the paradox seems to be that it is not rational to play any such casino game with a profitable expected return rate, if you desire any specific probability of success (whether it is 51%, 95%, 99%, etc.), and have any finite starting capital. Of course, this is absurd. Can anyone find a solution to this paradox?
This appears to be a paradox because your interpretation of the bankruptcy condition changes part of the way through your argument, especially when you say "once the player knows that he or she will walk away if at or below 20 USD, it becomes irrational to play the game with a starting capital of 21-40 USD". But there are at least two ways of avoiding this.
You could stop betting when the money you hold is positive but less than or equal to the minimum stake, i.e. you never bet enough that an adverse outcome in the next round reduces your holding to $0$ or less. If this leads to you stopping, then you are not bankrupt even though you may have lost money. This seems to be David K's point.
You could take to the casino an amount you are prepared to lose (leaving your main savings at home), in exchange for the positive overall expectation and the possibility of winning a potentially unlimited amount if you are willing to stay in the casino long enough.
Taking the second possibility, let's suppose that each independent round gives you a $46.27\%$ chance of losing your stake and a $53.73\%$ chance of winning the same amount. That gives you a net expected gain of $7.46\%$ of your stake each round. Now let's suppose you go to the casino with $\$N$ and you decide to bet $\$1$ each time until you either lose all the money you started with or gain some arbitrarily large amount (e.g. $\$1000$ or $\$1000000$ or whatever you have time for). We can now work out the probabilities that you lose all your initial $\$N$ before hitting your target gain, for different starting amounts; the formulae for the gambler's ruin with a biased coin are well known.
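These probabilities can be recomputed from the standard gambler's-ruin formula. A minimal sketch, assuming a target of $\$1000$ (any sufficiently large target gives essentially the same numbers):

```python
# Exact gambler's-ruin probabilities for the biased coin above:
# p = 0.5373 chance of winning 1 USD each round, betting 1 USD at a time.
# The target of 1000 USD stands in for "some arbitrarily large amount".

def ruin_probability(n, target=1000, p=0.5373):
    """Probability of hitting 0 before reaching `target`, starting
    from n, via the classic formula with r = q/p."""
    r = (1 - p) / p          # r < 1 because the game is favourable
    return (r**n - r**target) / (1 - r**target)

for n in (20, 21, 40, 41):
    print(n, round(ruin_probability(n), 4))
```

Starting from $\$20$ this gives a ruin probability just over $5\%$, and from $\$21$ just under $5\%$.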
So here the probability of losing everything is just over $5\%$ if $N=20$ and under $5\%$ if $N=21$. With $N=40$ you can square that to give just over $0.25\%$, and lower if $N=41$.
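The squaring step is just the large-target approximation to the ruin probability: with $r = q/p$,

$$P_{\text{ruin}}(N) \approx r^{N}, \qquad\text{so}\qquad P_{\text{ruin}}(40) \approx r^{40} = \left(r^{20}\right)^{2} \approx (0.05)^2 = 0.25\%.$$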
So the player in fact knows that starting with $\$21$-$40$ and following this strategy gives a probability of over $95\%$ of leaving with a very large amount (a satisfactory position in terms of how you began your question), and that starting with $\$41$ or more gives an even higher probability (an even more desirable position).
There is no paradox here so long as the player can tolerate a small risk of losing; if you are unwilling to contemplate any loss then you should not visit a casino at all.