Expected value and Gambler's fallacy


Betting on a fair coin has expected value 0 dollars.

Suppose we win 1 dollar for each win and lose the same for each loss. Suppose we have lost 100 dollars so far. Then is it right to say that this loss has to be balanced out by winnings somewhere in the future tosses of the coin? That would be because the expected value is 0, so we can't remain at -100 dollars till infinity. But that also implies that the set of future tosses of the coin is overall biased towards winning, which is the Gambler's fallacy. Please help.


There are 3 answers below.

BEST ANSWER

It's reasonable to assume all of the coin tosses are independent of each other, which means they have no "memory" of past results, including the fact that you've lost $100$ dollars so far. From that point on, the expected value of the change in your net amount is $0$, so on average you can expect to remain $100$ dollars behind. (Obviously your net can't sit at exactly $-100$ dollars forever, since each toss moves it up or down by $1$ dollar.) It is also true that, if you play enough times and have enough money available to cover any losses in the meantime, you are almost sure to be even at some point, and ahead at some point as well.

Also note that having an expected value of $0$ from when you start playing is not the same as your situation. Your situation is conditional on, at some point, having lost $100$ dollars. The expected-value calculation doesn't depend on, or account for, any such condition occurring at any particular time. As such, you can't expect the games you play later to make up for it; your net value will not improve to $0$ on average.
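
As a quick illustrative sketch (my addition, not part of the answer): averaging the final position over many continuations that all start $100$ dollars down shows the average staying near $-100$, not $0$.

```python
import random

random.seed(1)

def walk(steps):
    """Net winnings after `steps` fair one-dollar coin tosses."""
    return sum(random.choice((1, -1)) for _ in range(steps))

# Average final position over many futures that all start 100 dollars
# down.  Independence means the past loss is never "balanced out":
# the average stays near -100.
trials = 1000
avg = sum(-100 + walk(5_000) for _ in range(trials)) / trials
print(avg)  # hovers near -100
```

The trial and step counts are arbitrary; any reasonably large values show the same thing.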

ANSWER

You’re mixing up various things here. The fact that the expected value is zero doesn’t imply that we end up at any particular point; it’s just an expected value. To disentangle the various facts involved, imagine that you only bet $2^{-n}$ dollars on the $n$-th bet. Then your total profit will never be $0$ after the first bet; it will always be positive if you win the first bet and always negative if you lose the first bet. Yet the expected value of each bet and of the sum of the bets is zero.
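
To make this example concrete, here is a small sketch of my own that checks exhaustively, over every outcome sequence of ten bets of $2^{-n}$ dollars, that once the first bet resolves the running profit never returns to zero:

```python
from fractions import Fraction
from itertools import product

def running_profits(outcomes):
    """Running profit when 2^-n dollars are bet on toss n, for a
    sequence of +1 (win) / -1 (loss) outcomes."""
    total = Fraction(0)
    for n, outcome in enumerate(outcomes, start=1):
        total += outcome * Fraction(1, 2**n)
        yield total

# All bets after the first are together worth at most
# 1/4 + 1/8 + ... < 1/2, so the sign set by the first half-dollar bet
# can never be undone.  Verify over all 2^10 sequences of length 10.
for seq in product((1, -1), repeat=10):
    profits = list(running_profits(seq))
    if seq[0] == 1:
        assert all(p > 0 for p in profits)
    else:
        assert all(p < 0 for p in profits)
print("sign never returns to zero")
```

Exact `Fraction` arithmetic avoids any floating-point doubt about whether a running total is truly nonzero.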

Nevertheless, it so happens that if you do always bet the same amount, you will return to a net profit of $0$ at some point with probability $1$. As the above example shows, this does not follow from the fact that the expected value is $0$. It’s a further fact that needs to be proved separately. It doesn’t imply that the future tosses are biased; in fact it holds for arbitrary amounts: Given any (not necessarily positive) integer profit $a$ you have obtained after some number of bets and any (not necessarily positive) integer target $b$, the probability that you will at some point have a profit of $b$ is $1$. So if you’re at a profit of $-100$, not only will you almost surely return to a profit of $0$, you will also almost surely reach a profit of $-200$, which shows that no bias is implied.
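
A small Monte Carlo sketch (an illustration of mine, not part of the answer) of the "no bias is implied" point: from a profit of $-100$, a fair walk is as likely to reach $0$ before $-200$ as the other way around.

```python
import random

random.seed(2)

def first_boundary_hit(start, low, high):
    """Run a fair +/-1 walk from `start` until it reaches `low` or
    `high`, and return whichever boundary it hits."""
    pos = start
    while low < pos < high:
        pos += random.choice((1, -1))
    return pos

# From -100, count how often the walk reaches 0 before -200.  By the
# symmetry of the gambler's ruin problem this should be close to 1/2:
# no bias toward winning back the loss.
trials = 500
hits_zero = sum(first_boundary_hit(-100, -200, 0) == 0
                for _ in range(trials))
print(hits_zero / trials)  # close to 0.5
```

Each trial takes about $100 \times 100 = 10{,}000$ steps on average, which is why the boundaries matter for runtime.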

ANSWER

The other answers have done a good job of explaining why the Gambler's Fallacy is still a fallacy, but there is another, more subtle problem with your reasoning that I'd like to address, by expanding upon the ideas in joriki's answer with a specific motivating example. The best time to get a correct idea into your head about how the expected value of a random walk works is today!

Suppose we have lost 100 dollars so far.

Sure.

Then it's right to say that this loss has to be balanced out by the winnings somewhere in the future tosses of the coin.

Correct. This one-dimensional random walk will, if you continue it long enough, hit every possible value arbitrarily many times, so you will certainly balance out those losses by wins if you play long enough, with 100% certainty. Note that you need an arbitrarily large bankroll to keep playing because your losses are also unlimited.
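
A classical reflection-principle identity makes "if you play long enough" quantitative: the probability that the walk has not yet returned to $0$ after $2n$ tosses is $\binom{2n}{n}/4^n$, which decays like $1/\sqrt{\pi n}$. A small sketch of my own to watch it shrink:

```python
from math import comb, pi, sqrt

def p_no_return(n):
    """Probability a fair +/-1 walk has not returned to 0 within 2n
    steps; a classical reflection-principle result: C(2n, n) / 4^n."""
    return comb(2 * n, n) / 4**n

# The no-return probability decays like 1/sqrt(pi*n), so it goes to 0:
# the walk returns to 0 (and, repeating the argument, hits every value)
# with probability 1 -- but only in the limit.
for n in (10, 100, 10_000):
    print(n, p_no_return(n), 1 / sqrt(pi * n))
```

Python's exact big-integer arithmetic makes the formula safe to evaluate directly even for large $n$.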

That's because the expected value is 0, so we can't remain at -100 dollars till infinity.

This is the interesting point that I'd like to explore.

What you're getting at here is that you have to get back to zero eventually. That is correct in this case but it is not correct in general!

Suppose we have three kinds of money: dollars, pesos and yen, say. Now on every turn we do three coin flips, one for a dollar, one for a peso, and one for a yen. This situation is no different from before; the expected value of this game is zero dollars, zero pesos and zero yen. What is the probability that the game ever returns to exactly its starting amounts of all three kinds of money at once, even in an infinite number of coin flips? It is only about one in three! And if we have fallen to -100 in all three currencies, the probability of ever climbing back to exactly zero in all three at once is smaller still.

Just because the expected value of a game is (zero, zero, zero) does not mean that we must necessarily ever get there again. Expected value describes the average outcome of many games averaged together, not the current state of a specific ongoing game. In the average game you'll win as many dollars, pesos and yen as you lose, but in any specific game you will likely never get back to exactly even in all three currencies at the same time. And if you increase it to four, five, six currencies, it becomes unlikely in the extreme that you ever hit exactly even again.
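
A quick Monte Carlo sketch of mine, with an arbitrary finite horizon, estimating how often the three-coin walk ever returns to its starting point (very late returns are missed, but they are rare):

```python
import random

random.seed(3)

def ever_returns(steps):
    """Flip three independent fair coins per turn and report whether
    the (dollar, peso, yen) position ever returns to (0, 0, 0)."""
    d = p = y = 0
    for _ in range(steps):
        d += random.choice((1, -1))
        p += random.choice((1, -1))
        y += random.choice((1, -1))
        if d == p == y == 0:
            return True
    return False

# Estimate the probability of ever being exactly even in all three
# currencies at once, within a 2000-turn horizon.
trials = 1000
frac = sum(ever_returns(2000) for _ in range(trials)) / trials
print(frac)  # roughly 0.3, unlike the one-currency walk's 1.0
```

Contrast this with the one-currency walk, where the same experiment would approach a return fraction of 1 as the horizon grows.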

Now, it is the case that in this game if you play long enough you will be up in all three currencies. My point is solely that you must not reason "I know the expected value is X therefore I will someday get back to X if I keep playing". That reasoning is false.

It is the case for the one-kind-of-money game that you get back to zero infinitely many times as you play infinitely long, but that is a fact about one-dimensional random walks, not a fact about the relationship between expected value and game state. Make sure you understand this distinction!

But that also implies that the set of future tosses of the coin are overall biased towards winning, which is Gambler's fallacy.

No, it does not. Suppose you are at -100, as you say, and you make ten million more tosses. At the end of those ten million tosses starting from -100, you are as likely to be at zero as at -200, as likely to be at -1100 as at +900, and as likely to be at -102 as at -98.

There is no bias towards winning; rather, you are very slightly more likely to still be behind than ahead after n tosses for arbitrarily large n. Work out the math if you don't believe me, but how could it be otherwise? You can't be more likely to be ahead if you start from behind.