How, precisely, is regression toward the mean different from the Monte Carlo fallacy?


In statistics, regression toward the mean arises when a sampled value of a random variable is extreme, i.e. an outlier. A future sample is then (as I understand it) more likely to be closer to the mean than the extreme sample was (or, is it just likely to be closer to the mean?).

The Monte Carlo fallacy is a similar concept, which I've heard worded a few different ways. Wikipedia describes it as: "... the erroneous belief that if a particular event occurs more frequently than normal during the past it is less likely to happen in the future (or vice versa)". The "events" in question are, of course, taken to be independent.


I don't quite understand how these two ideas are meaningfully different. There's some subtle difference, sure. But shouldn't the first imply the latter? For example, consider a coin flip. Suppose an "extreme event" occurs, in which $15$ flips in succession are all heads. We take the coin to be fair. By regression toward the mean, a future sample is more likely to be closer to the mean than the extreme sample was. But does this not imply the Monte Carlo fallacy? If a future event is more likely to be closer to the mean, i.e. "non-extreme", is it not also "less likely [for the extreme event] to happen in the future"?
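To make the coin example concrete, here is a small simulation sketch (the function names are mine, not from any source). It checks two things at once: conditional on having just seen 15 heads, the next flip is still heads about half the time (so the gambler's-fallacy "correction" never appears), and yet the mean of the *next block* of 15 flips is almost always closer to $0.5$ than the extreme sample mean $15/15$ was:

```python
import random

random.seed(0)

def next_flip_after_streak(trials=100_000):
    # A fair coin's future flips are independent of the 15-head streak,
    # so we need not simulate the streak itself: just flip once per trial.
    heads = sum(random.random() < 0.5 for _ in range(trials))
    return heads / trials

def mean_of_next_block(n=15, trials=100_000):
    # Fraction of trials in which the mean of the next n flips lands
    # strictly closer to 0.5 than the extreme sample mean 1.0 did.
    closer = 0
    for _ in range(trials):
        m = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(m - 0.5) < abs(1.0 - 0.5):
            closer += 1
    return closer / trials

print(next_flip_after_streak())  # ~0.5: the single next flip has no memory
print(mean_of_next_block())      # ~1.0: but the next block's mean regresses
```

The point of the sketch: regression toward the mean is a statement about the distribution of a fresh sample mean (most $15$-flip means sit near $0.5$ simply because extreme means are rare), not a statement that any individual flip's probability has shifted below $1/2$.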

In a rigorous sense, how are these ideas different?