A winning wager that loses over time


This problem was posted in Scientific American (vol. 321, no. 5, Nov 2019, p. 73), and it has been troubling me.

The game:

We flip a fair coin. If we flip heads, we gain 20% of our bet; if we flip tails, we lose 17% of our bet. Starting bankroll: $100.

Stipulations: We must bet all the chips we have, cannot reload, and we must play at least 10 flips.

Note: there's no minimum bet, in the sense that we can keep playing with an infinitesimally small chip stack. Since we're always betting a fraction of our bankroll, there's no risk of ruin.


The expected value is net positive. Using the first flip as an example: EV = (0.5 * $20) - (0.5 * $17) = + $1.50

For 10 flips considering all outcomes: EV = +16.05% https://docs.google.com/spreadsheets/d/15nXStFnsEHFU938erWaCKDVMmFdaOVeHNJ-taQZLcJs/edit?usp=drivesdk
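The 10-flip expected value can be checked directly by summing over the binomial distribution of head counts. A quick sketch in Python:

```python
from math import comb

# EV after 10 flips of a $100 bankroll: h heads multiply it by 1.2^h,
# and the remaining (10 - h) tails multiply it by 0.83^(10 - h).
start = 100.0
ev = sum(comb(10, h) * 0.5**10 * start * 1.2**h * 0.83**(10 - h)
         for h in range(11))
print(round(ev, 2))  # 116.05, i.e. +16.05%
```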


However, there's another way to look at this. When we win our bet is multiplied by 1.2. When we lose our bet is multiplied by 0.83.

Let's say we win one, lose one: 1.2*0.83 = 0.996

If we win 5 flips and lose 5 flips in any order: 1.2^5 * 0.83^5 = 0.9802, for a net loss of $1.98.

Losses "hit harder" than wins. From this perspective it looks like a disadvantageous game.

Put technically, the geometric mean is less than 1, which means we expect our bankroll to shrink on average.

In this game the Kelly criterion says we should risk no more than 7.5% of our bankroll, yet we're forced to risk more than double that: 17%. It's a well-known principle that betting more than twice Kelly results in a shrinking bankroll; this is just another way of saying the geometric mean of growth is less than 1.
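The 7.5% figure can be reproduced with the standard Kelly formula for a bet that pays +20% of the stake on a win and costs 17% of it on a loss. This is only a sketch: the closed form below comes from setting the derivative of the expected log growth to zero.

```python
# Stake a fraction f of the bankroll; a win pays +20% of the stake,
# a loss costs 17% of it. Kelly maximizes
#   E[log growth] = 0.5*log(1 + 0.20*f) + 0.5*log(1 - 0.17*f),
# whose maximizer has the closed form below.
p, b, a = 0.5, 0.20, 0.17            # win prob, win rate, loss rate
f = (p * b - (1 - p) * a) / (a * b)  # optimal fraction of bankroll to stake
print(round(f, 4))                   # ~0.4412 of the bankroll staked
print(round(a * f, 4))               # ~0.075 of the bankroll actually at risk
```

Betting the whole stack (f = 1) puts 17% at risk, more than double the 7.5% Kelly risk.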


Scientific American claims the casino will make money over time. They claim that this holds true even if you flip 100 or 10,000 times or more.

I can't wrap my head around the fact that a +EV bet could lose over time in a game with no risk of ruin. I suspect this has to do with "volatility drag": I can follow the math, but I can't build an intuition for it.

So my question is, should you play this game?


BEST ANSWER

Here are all the possible outcomes if you stop after ten tosses of the coin. I have rounded the numbers for display, but the calculations were at the full precision of the software on which they were computed. In the table, $H$ is a random variable equal to the number of heads in the sequence of flips, and $X$ is a random variable equal to the amount of your money on the table at the end of the sequence. The last number in the fifth or seventh column is the sum of the numbers in the column above it, so the last number in the fifth column is the expected value of the money on the table at the end of the game, computed in the usual way.

\begin{array}{ccccccc} h & P(H=h) & P(H\leq h) & X & XP(H=h) & \log_{10}X & (\log_{10}X)P(H=h) \\ \phantom{0} 0 & 0.000977 & 0.000977 &\phantom{0} 15.52 &\phantom{00} 0.02 & 1.1908 & 0.0012 \\ \phantom{0} 1 & 0.009766 & 0.010742 &\phantom{0} 22.43 &\phantom{00} 0.22 & 1.3509 & 0.0132 \\ \phantom{0} 2 & 0.043945 & 0.054688 &\phantom{0} 32.43 &\phantom{00} 1.43 & 1.5110 & 0.0664 \\ \phantom{0} 3 & 0.117188 & 0.171875 &\phantom{0} 46.89 &\phantom{00} 5.50 & 1.6711 & 0.1958 \\ \phantom{0} 4 & 0.205078 & 0.376953 &\phantom{0} 67.79 &\phantom{0} 13.90 & 1.8312 & 0.3755 \\ \phantom{0} 5 & 0.246094 & 0.623047 &\phantom{0} 98.02 &\phantom{0} 24.12 & 1.9913 & 0.4900 \\ \phantom{0} 6 & 0.205078 & 0.828125 & 141.71 &\phantom{0} 29.06 & 2.1514 & 0.4412 \\ \phantom{0} 7 & 0.117188 & 0.945313 & 204.88 &\phantom{0} 24.01 & 2.3115 & 0.2709 \\ \phantom{0} 8 & 0.043945 & 0.989258 & 296.21 &\phantom{0} 13.02 & 2.4716 & 0.1086 \\ \phantom{0} 9 & 0.009766 & 0.999023 & 428.26 &\phantom{00} 4.18 & 2.6317 & 0.0257 \\ 10 & 0.000977 & 1.000000 & 619.17 &\phantom{00} 0.60 & 2.7918 & 0.0027 \\ & & & & 116.05 & & 1.9913 \end{array}

So we see you have about a $62\%$ chance of losing money, although less than a $38\%$ chance of losing more than two dollars. On the other hand you could win over $\$500$; the chance of winning at least $\$100$ is over $17\%$, while the chance of losing $\$100$ is zero. Adding up the ordinary expected value of the game, it comes out to an expected gain of about $\$16.05.$

But if we take the expected base-ten logarithm of the amount of your money on the table after ten tosses (the number at the bottom of the last column), we see that it is only about $1.9913,$ whereas the base-ten logarithm of your starting amount is exactly $2.$
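The expected logarithm at the bottom of the last column can be recomputed directly, again summing over the binomial distribution (a sketch in Python):

```python
from math import comb, log10

# E[log10(X)] after 10 flips, starting from $100: weight each head
# count h by its binomial probability.
e_log = sum(comb(10, h) * 0.5**10 * log10(100 * 1.2**h * 0.83**(10 - h))
            for h in range(11))
print(round(e_log, 4))  # 1.9913, just below the starting log10(100) = 2
```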

What does this mean? It means that when you measure your outcomes according to a utility function that is not necessarily identical to the raw outcomes (the number of dollars on the table), the expected utility of the game can be positive or negative, depending on your utility function. If your utility function is $f(x) = x$, this game has positive expected utility, but if your utility function is $f(x) = \log_{10} x,$ this game has negative expected utility. That is, $E(X - 100) > 0$ but $E(\log_{10} X - \log_{10} 100) < 0.$

Note that using a different base of the logarithm will apply a different scaling factor to the utility, but it will still be negative. I used base $10$ because it gives a nice round number for the initial utility, so it is easy to see when you gain and when you lose.

Note that if we apply logarithmic utility in this way to a game where you put $\$100$ on the table and then flip a fair coin once, double or nothing, you have a $\frac12$ chance to increase your utility by about $0.3$ and a $\frac12$ chance to decrease your utility by $\infty.$ This is an absurd result, and is due to the absurdity of using "logarithm of money on the table" as a utility function. A somewhat more reasonable utility function for a casino game is to assume that you have some reserves of some kind somewhere that you do not put on the table, and take the logarithm of the sum of your reserves plus the money on the table. Suppose we only count the money in your bank account as these "reserves", and assume you had only $\$1000$ at the start of the day and withdrew $\$100$ in order to play the game. Then the utility of your initial state is $\log_{10}1000 = 3,$ and the utility of your outcome is $\log_{10}(900 + X).$ We get the following results:

\begin{array}{cccccc} h & P(H=h) & X & 900 + X & \log_{10}(900+X) & (\log_{10}(900+X))P(H=h) \\ \phantom{0} 0 & 0.000977 & \phantom{0} 15.52 &\phantom{0}915.52 & 2.9617 & 0.0029 \\ \phantom{0} 1 & 0.009766 & \phantom{0} 22.43 &\phantom{0}922.43 & 2.9649 & 0.0290 \\ \phantom{0} 2 & 0.043945 & \phantom{0} 32.43 &\phantom{0}932.43 & 2.9696 & 0.1305 \\ \phantom{0} 3 & 0.117188 & \phantom{0} 46.89 &\phantom{0}946.89 & 2.9763 & 0.3488 \\ \phantom{0} 4 & 0.205078 & \phantom{0} 67.79 &\phantom{0}967.79 & 2.9858 & 0.6123 \\ \phantom{0} 5 & 0.246094 & \phantom{0} 98.02 &\phantom{0}998.02 & 2.9991 & 0.7381 \\ \phantom{0} 6 & 0.205078 & 141.71 & 1041.71 & 3.0177 & 0.6189 \\ \phantom{0} 7 & 0.117188 & 204.88 & 1104.88 & 3.0433 & 0.3566 \\ \phantom{0} 8 & 0.043945 & 296.21 & 1196.21 & 3.0778 & 0.1353 \\ \phantom{0} 9 & 0.009766 & 428.26 & 1328.26 & 3.1233 & 0.0305 \\ 10 & 0.000977 & 619.17 & 1519.17 & 3.1816 & 0.0031 \\ & & & & & 3.0059 \end{array}

Since your starting utility was $3,$ this is an expected gain.

Note that one of the assumptions of the research on which the Scientific American article is based is that you're not just taking $\$100$ out of your bank account to play the game; you are effectively putting your entire wealth on the table. That's what makes the losses in that game so devastating. Compare this to a lottery where you have a $0.1\%$ chance to win a large multiple of the ticket price and a $99.9\%$ chance to win nothing. A lottery like this where the price of a single ticket is "everything you own" is a very different matter from a lottery where a ticket costs a dollar.

ANSWER

In your second approach, you are only comparing games that end up with an equal number of winning and losing tosses. That's not entirely fair.

For instance, consider playing the game with two flips. Winning twice (total gain of $\$44$) hits a lot harder than losing twice (total loss of $\$31$), and this difference more than makes up for the tiny loss in case you get one win and one loss (total loss of $\$0.40$).

This holds up as the number of tosses increases. For instance, with ten flips, six wins and four losses yield a net gain of $\$41.71$, while six losses and four wins lose you only $\$32.21$. The corresponding differences between the 7-3, 8-2, 9-1 and 10-0 results are even larger.
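That asymmetry is easy to tabulate. A small Python sketch comparing each majority-win split of ten flips against its mirror image:

```python
# For each majority-win split of 10 flips, compare the gain with the
# loss from the mirrored (majority-loss) split, starting from $100.
start = 100.0
for wins in range(6, 11):
    losses = 10 - wins
    gain = start * 1.2**wins * 0.83**losses - start   # e.g. 6-4: +41.71
    loss = start - start * 1.2**losses * 0.83**wins   # mirror 4-6: -32.21
    print(wins, losses, round(gain, 2), round(loss, 2))
```

In every row the gain from the winning split exceeds the loss from its mirror, which is what pulls the arithmetic expectation positive.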

ANSWER

The first play is a loss:

100 - (100 * 0.17) = 83 .

To break even, the second play must be a win with

83 + (83 * x) = 100 ,

which gives x = 0.2048. But a win pays only x = 0.20 .

Put another way: for a 20% win to return you to 100, the first play would have had to leave you with 83.3333, since

83.3333 + (83.3333 * 0.20) = 100 .
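The break-even arithmetic generalizes: after losing a fraction L of the stack, the gain needed to get back to even is L / (1 - L). A quick sketch:

```python
# Gain needed to recover from a fractional loss L: solve
# (1 - L) * (1 + x) = 1  =>  x = L / (1 - L).
L = 0.17
x = L / (1 - L)
print(round(x, 4))  # 0.2048, i.e. 17/83 -- more than the 0.20 a win pays
```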

ANSWER

To answer "should you play this game" you should calculate the probability of each result. All that matters is the number of wins and losses. As you observe, if you win as many times as you lose you will come out behind; with $10$ flips you will be down about $2\%$. The expected value is positive because you will be ahead by much more if you get more than $5$ heads than you will be behind if you get fewer than $5$ heads.

The casino will lose money over time: for every two dollars bet (one winning flip and one losing flip, on average), it pays out $0.20$ and takes in $0.17$, for a loss of $0.03$.

People's reactions to winning and losing money are not linear. If the stake were one dollar I would find it inconsequential: even if you get ten heads you only win a little over $5$. Whether to play then depends on whether you enjoy it; the money is not the thing. As the stake rises you enter a regime where the positive expectation attracts you while the possibility of losing doesn't scare you off too much. As it rises further, you can lose enough to matter to your life, and you probably become risk averse and decline.

ANSWER

If you start with an initial bet $b$, the expected value for the amount on the table after $n$ fair coin tosses is

$$B_n=b\left(p^n+{n\choose1}p^{n-1}q+{n\choose2}p^{n-2}q^2+\cdots+{n\choose n-1}pq^{n-1}+q^n \right)$$

where $p=1.2/2$ and $q=0.83/2$. By the Binomial Theorem, this is

$$B_n=b(p+q)^n=b(2.03/2)^n=(1.015)^nb$$

which clearly goes to infinity as $n\to\infty$. Note, however, that

$$\begin{align} (2p)^{n-k}(2q)^k=(1.2)^{n-k}(0.83)^k\ge1 &\iff(n-k)\log(1.2)+k\log(0.83)\ge0\\ &\iff k\le {n\log(1.2)\over\log(1.2)-\log(0.83)}\approx0.49456\ldots n \end{align}$$

which means that the probability that a player winds up with a profit is less than one half. This accords with what the Scientific American article is actually about, namely the "inevitability" of wealth inequality: In this game, most people lose money, some make a little money, and a few make a lot of money. The OP's confusion seems to stem from reading a discussion of the article where people incorrectly paraphrased what the article says, rather than the article itself. The Scientific American article uses the image of a casino only to describe the rules of the game; it doesn't claim the game is a money maker for the casino.
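A simulation makes the mean/median split concrete. This is a sketch with hypothetical parameters (10,000 players, 100 flips each, fixed seed): the average bankroll grows like $1.015^n$, yet the typical (median) player ends up behind, because a few huge winners carry the mean.

```python
import random

random.seed(0)

def play(n_flips, bankroll=100.0):
    """Bet the whole stack each flip: x1.2 on heads, x0.83 on tails."""
    for _ in range(n_flips):
        bankroll *= 1.2 if random.random() < 0.5 else 0.83
    return bankroll

results = sorted(play(100) for _ in range(10_000))
mean = sum(results) / len(results)
median = (results[4999] + results[5000]) / 2
print(mean > 100)    # True: the average grows toward 100 * 1.015**100
print(median < 100)  # True: the typical player shrinks, near 100 * 0.996**50
```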

ANSWER

In my view, the contradiction between the arithmetic and geometric means is only apparent.

To resolve it we need the Law of Large Numbers.

In this game you need to win about 50.6% of your bets to come out ahead, and this threshold is constant in $n$.

By the Law of Large Numbers, however, the variance of the observed fraction of heads decreases as $p(1-p)/n$, where $p = 1/2$.

Also, the arithmetic expected value is about 1.5% per flip, roughly 3x the excess win rate needed to come out ahead.

So we can fix a finite $n$ for which $p(1-p)/n < 2.53\%$ (that is, $5 \times 0.506$, five standard deviations above the average, beyond which winning is essentially impossible). But $n$ is finite while the number of bets is unbounded, so the expected value converges to zero.

ANSWER

S Spring has it right. I've also noticed a pattern for figuring out how much one must win back after such a percentage loss, expressed as a fraction: in this case, one needs a win of seventeen 83rds. :-)

The "trick" is that after the initial bet, the amount you are taking a percentage of has changed. I've added a picture of a table that shows the percentage gain required to get even after a particular percentage loss.

My fractional trick is merely a pattern I noticed; that 17/83 of mine is S Spring's 0.2048.

I'm sure I'm not the first to notice this "fractional trick", but I haven't been able to Google up anything related to it. I'm certainly no mathematician and am embarrassed to be here in the presence of those who are, but I'd love to be pointed to a URL that says something more about it.

[table of percentage gains and losses]