Is there some predictability in (pseudo-)randomness?


I was wondering if random numbers actually have some predictability. For example, the digits of $\pi$ are supposedly random, yet the frequency of some digits "lags" that of others: if I take $1$ million digits of $\pi$, there are not exactly $100,000$ of each of the digits $0$ through $9$. Does that mean the lagging digits are more likely to occur, to balance things out? If so, it seems the next "random" digit of $\pi$ could be predicted with greater than $10$% confidence.

It might be fun/cool if someone wrote a computer program that tried to predict the next digit of $\pi$ based on the digit frequencies seen so far, biasing towards the lagging digits that are "due". I wonder whether, after maybe $1$ million predictions, the percentage correct would exceed $10$%, and by how much.

So my question is: might this technique help predict "random" digits, or has it been tried and failed?
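For what it's worth, the experiment is easy to sketch. Below is one possible version in Python: digits of $\pi$ come from a standard spigot algorithm, and at each step the prediction is the least-frequent digit seen so far (the one most "due"). The function names and the sample size of $2000$ are arbitrary choices for illustration.

```python
def pi_digits():
    """Yield the decimal digits of pi (3, 1, 4, 1, 5, ...) via a spigot algorithm."""
    q, r, t, j = 1, 180, 60, 2
    while True:
        u, y = 3 * (3 * j + 1) * (3 * j + 2), (q * (27 * j - 12) + 5 * r) // (5 * t)
        yield y
        q, r, t, j = (10 * q * j * (2 * j - 1),
                      10 * u * (q * (5 * j - 2) + r - y * t),
                      t * u, j + 1)


def lagging_digit_accuracy(n):
    """Predict each digit as the least-frequent digit seen so far; return hit rate."""
    counts = [0] * 10
    hits = 0
    gen = pi_digits()
    for _ in range(n):
        prediction = counts.index(min(counts))  # the digit most "due"
        actual = next(gen)
        if prediction == actual:
            hits += 1
        counts[actual] += 1
    return hits / n


print(lagging_digit_accuracy(2000))
```

If the digits of $\pi$ behave like fair random draws, this strategy should hover around $10$% accuracy rather than beat it.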


There are two answers below.


I don't know about digits of $\pi$, but if you have a truly random generator, then "lags" of certain digits will not correct themselves.

By that I mean the following:

Say you have a random generator that returns integers from $0$ to $9$, and say that in the last one million tries the generator produced $200,000$ nines. The probability of the generator producing another nine on the next step is still $\frac1{10}$, which means the probability that there will be fewer nines than other digits in the next million integers generated is exactly the same as it would be if the generator had produced zero nines in the first million tries.
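A quick simulation makes this concrete (a sketch in Python; the batch size of $200$ and the number of trials are arbitrary): in each trial, find the "lagging" digit after a batch of fair draws, then check whether the very next draw hits that digit more often than $1$ time in $10$.

```python
import random

random.seed(0)  # fixed seed for reproducibility

hits, trials = 0, 20_000
for _ in range(trials):
    counts = [0] * 10
    for _ in range(200):
        counts[random.randrange(10)] += 1
    lagging = counts.index(min(counts))   # the digit that "lags" in this batch
    if random.randrange(10) == lagging:   # does the next draw favor it?
        hits += 1

print(hits / trials)  # close to 0.10: no compensation effect
```

The lagging digit is hit about $10$% of the time, exactly as if we had picked any digit in advance.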


I think the "pseudo-random" nature of the digits of $\pi$ is confusing the issue here. It's perfectly conceivable that the digits of $\pi$ behave in the fashion you indicate - that it would be more likely for a digit to show up if it hadn't been seen in a while. But if this were the case, the digits would not be behaving in a truly random way.

Let's say you flip a fair coin one billion times. In the real world, all coins are biased one way or another, but here we're doing math, so we can assume it is in fact perfectly fair. Let's say that the first 200 flips were all heads. The law of large numbers tells us that we should expect the percentage of heads to tend downward toward 50%. But this should not be thought of as a mystical force that compels the coin to come up tails - remember, we are assuming that each time we flip there is still a 50% chance of heads. Instead, consider that if the coin is fair, on the next $999,999,800$ flips we expect to find about $499,999,900$ heads and $499,999,900$ tails, giving a total of $499,999,900$ tails and $500,000,100$ heads. Since the heads have a head start, we do expect to see slightly more heads than tails overall. But compared to the large numbers at play here, the difference will be negligible: we would expect 50.00001% heads.
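The bookkeeping in that paragraph is easy to verify directly; a few lines of Python (using exactly the numbers above) reproduce the totals:

```python
total_flips = 1_000_000_000
head_start = 200                 # the first 200 flips all came up heads
remaining = total_flips - head_start

# Each remaining fair flip contributes 1/2 a head in expectation.
expected_heads = head_start + remaining // 2  # 500,000,100
expected_tails = remaining // 2               # 499,999,900

pct_heads = 100 * expected_heads / total_flips
print(expected_heads, expected_tails, round(pct_heads, 5))
# 500000100 499999900 50.00001
```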

The point here is that anything that happens in a finite number of flips is ultimately insignificant in the long run.