Is post hoc Bayesian reasoning valid?


I keep running into people making arguments where they infer things from the probability of events that have already happened. A classic example is someone winning the lottery and then claiming divine intervention as the only explanation for their win. Many conspiracy theories follow a similar formula, and arguments for the simulation hypothesis can take the same shape. The basic claim is that:

  1. Under a "standard understanding" of the world, the probability of the current state of affairs is extremely small.
  2. Therefore, there must be some factor outside this "standard understanding" (e.g. Odin, the CIA, the simulation architects) that is forcing the current state.

Arguments like this have always struck me as very suspicious. To me, this is like drawing an arbitrary poker hand and then saying “the probability of this specific poker hand is very small, therefore the deck must be stacked in favor of it”. Another way to put it is that people are confusing P(A) with P(A | A). Event A has already occurred, so they are living in the world of P(A | A), but they reason as if they are living in the world of P(A).
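The poker analogy can be made concrete with a couple of lines of Python (standard library only):

```python
from math import comb

# Number of distinct 5-card hands from a standard 52-card deck
total_hands = comb(52, 5)
print(total_hands)       # 2598960

# Probability of drawing any one *specific* hand -- tiny,
# yet a hand of exactly this probability is dealt every single time.
p_specific = 1 / total_hands
print(p_specific)        # ~3.85e-07
```

Every deal produces a hand whose prior probability was about one in 2.6 million, which is exactly why the improbability of an already observed outcome proves nothing by itself.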

I think what makes this confusing is that we are often asked to make probabilistic inferences about the present based on past events. (E.g. “Is the defendant guilty?”) And the shift between the plausible argument “P(X) < P(Y), so assume Y” and the implausible argument “P(X | X) < P(Y), so assume Y” is subtle.

I’ve asked GPT and done some searching on this but there doesn’t seem to be a formal name for this kind of mistake. The people who do this are often highly intelligent, so I can’t dismiss them as careless or uncritical thinkers.

Is there a name for this? Am I making some reasoning mistake?


There is 1 answer below.


This is a consequence of updating your beliefs on very little data, combined with confirmation bias: we remember improbable events more distinctly.

If your prior beliefs are that winning the lottery is roughly one in a million and you buy one ticket and win, then updating your beliefs via Bayes' rule will likely result in a posterior that has shifted noticeably from the prior, hence divine intervention, conspiracy, etc. What you saw with your eyes differs from how you previously thought about the lottery, so something drastic must be happening. However, if you kept buying tickets, you would likely not win and your posterior would revert towards the true odds as you take more samples. More independent, identically distributed data will result in more accurate estimates of the true probability (assuming your prior beliefs weren't absurd).
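The single-observation update described above can be sketched numerically. The priors and likelihoods below are illustrative assumptions, not quantities from the answer:

```python
# Two rival hypotheses about one lottery win.
p_win_given_chance = 1e-6        # assumed: fair one-in-a-million lottery
p_win_given_intervention = 1.0   # assumed: the rival hypothesis forces a win
prior_intervention = 1e-4        # assumed prior weight on the rival hypothesis

# Bayes' rule after observing exactly one win
evidence = (p_win_given_intervention * prior_intervention
            + p_win_given_chance * (1 - prior_intervention))
posterior_intervention = (p_win_given_intervention * prior_intervention
                          / evidence)
print(posterior_intervention)    # ~0.99: one observation swamps the prior
```

A single win moves a 1-in-10,000 prior on “intervention” to roughly 99%, which is the “something drastic must be happening” feeling; losing on further tickets would push the posterior back down.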

For a concrete example, you can work something out involving flipping coins. See, e.g., this post. If you are a suspicious person and believe a priori that half of all coins are rigged to have two heads, then when someone flips a coin for you and it lands on heads, you believe it is a fair coin with probability $1/3$. Moving from $1/2$ to $1/3$ belief in a fair coin is a big jump! But more samples from a fair coin will eventually bring your belief closer and closer to $1$.
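The coin numbers can be checked directly. A minimal sketch, assuming the rigged coin always lands heads:

```python
def posterior_fair(n_heads, n_tails, prior_fair=0.5):
    """Posterior probability that the coin is fair, given the flips seen."""
    like_fair = 0.5 ** (n_heads + n_tails)      # fair coin: each flip is 1/2
    like_rigged = 1.0 if n_tails == 0 else 0.0  # two-headed coin never shows tails
    num = like_fair * prior_fair
    return num / (num + like_rigged * (1 - prior_fair))

print(posterior_fair(1, 0))   # 0.333...: one head drops belief from 1/2 to 1/3
print(posterior_fair(10, 0))  # ~0.00098: ten straight heads look rigged
print(posterior_fair(10, 1))  # 1.0: a single tail rules out the rigged coin
```

The last line shows why sampling from a genuinely fair coin eventually restores your belief: the first tail observed is decisive against the two-headed hypothesis.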