Is there true randomness?


In a paper titled "Quantum Randomness: From Practice to Theory and Back", Cristian S. Calude concludes that there is no true randomness in numbers:

"The “magic” of the quantum technology capable of producing unbreakable security depends on the possibility of producing true random bits. What does “true ran-domness” 5 mean? The concept is not formally defined, but a common meaning is the lack of any possible correlations. Is this indeed theoretically possible? The answer is negative: there is no true randomness, irrespective of the method used to produce it. The British mathematician and logician Frank P. Ramsey was the first to demonstrate it in his study of conditions under which order must appear (see [24, 32]); other proofs have been given in the framework of algorithmic information theory [9]"

I just want to ask if this is actually true. Are all real numbers predictable, even the uncomputable ones? Or are there numbers that cannot be predicted? What about quantum mechanics? Is it an exception?


There are 2 best solutions below


The term "predictable" applies to some process that generates numbers, not to numbers themselves. And so we can talk about predicting a number generator.

Note, however, that the quoted text doesn't talk about things being predictable. It talks about correlations, and on that point it is correct: given any data generator, it is impossible to produce data that is entirely free of correlations. In fact, in the quantum realm the situation is, in a sense, worse, because at the quantum level everything is related to everything else. Viewed from a different perspective, the entire universe is a single enormous wave function: there are no separate "things", only one thing, and everything around us is a piece of it. The evolution of that wave function in time is, in principle, predicted by the Schrödinger equation, although computing it is practically impossible. This is especially painful when trying to build a quantum computer, because of the phenomena of quantum interference and decoherence.

There are no exceptions to this. True randomness, in the sense of a complete lack of correlations, does not exist. From a practical point of view, though, it doesn't really matter: a number generator does not have to be "true" to suit our needs.
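To illustrate that last point, here is a quick sanity check (a sketch, not a proof of randomness): bits drawn from Python's built-in Mersenne Twister PRNG, which is entirely deterministic, still show a lag-1 autocorrelation statistically indistinguishable from zero, which is all most practical applications ask for.

```python
import random

# Deterministic PRNG with a fixed seed: not "truly" random in any sense,
# yet its output shows no measurable lag-1 correlation.
rng = random.Random(12345)
bits = [rng.getrandbits(1) for _ in range(100_000)]

mean = sum(bits) / len(bits)
num = sum((a - mean) * (b - mean) for a, b in zip(bits, bits[1:]))
den = sum((b - mean) ** 2 for b in bits)
autocorr = num / den  # sample lag-1 autocorrelation

print(f"lag-1 autocorrelation: {autocorr:+.4f}")
# For 100,000 samples the standard error is about 0.003, so any
# well-behaved generator should land well inside |autocorr| < 0.02.
assert abs(autocorr) < 0.02
```

The seed and sample size here are arbitrary choices; the point is only that a deterministic generator can pass this kind of statistical test.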


I can't address the entire argument of the paper, but the appeal to Ramsey theory in particular is bullshit.

Ramsey theory is often described as proving that "in any completely random structure, some amount of order must appear", but treating this as making a sequence "less random" is only possible if you have no precise notion in mind of either randomness or Ramsey theory. To avoid starting with anything too technical, here is a simple Ramsey-theoretic result:

In any sequence of $2n-1$ zeroes and ones, there is a subsequence $S$ which consists of either $n$ zeroes, or $n$ ones.
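This result is just the pigeonhole principle, and for a small $n$ it can be verified exhaustively. The helper name below is my own, not from the paper:

```python
from itertools import product

def has_constant_subsequence(bits, n):
    """A sequence contains a constant subsequence of length n exactly
    when it has at least n zeroes or at least n ones."""
    return bits.count(0) >= n or bits.count(1) >= n

n = 4
# Pigeonhole: a sequence of length 2n - 1 = 7 with fewer than 4 zeroes
# AND fewer than 4 ones would have at most 3 + 3 = 6 bits in total.
assert all(has_constant_subsequence(seq, n)
           for seq in product((0, 1), repeat=2 * n - 1))
print("verified for all sequences of length", 2 * n - 1)
```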

Knowing where $S$ appears would indeed reveal a lot of information about our sequence, making it more predictable. Even knowing which of the two options occurs tells us a bit about the sequence: does it have more zeroes or more ones? The catch is that there are many possibilities for $S$: the randomness in exactly which subsequence occurs cancels out the loss of randomness from knowing $S$. This can be quantified in a couple of related ways:

  • Information-theoretically: suppose that the original sequence is chosen uniformly at random. The Shannon entropy of the sequence is equal to the entropy of (the position and type of) $S$, plus the conditional entropy of the sequence given $S$, which we can think of as the "remaining randomness" in the sequence once we know what and where $S$ is.
  • In terms of compression: we can say that a sequence is not truly random if it can be compressed to a shorter sequence. Knowing where $S$ is does let us compress the sequence quite a bit, but the number of bits it takes to specify $S$ exactly cancels out the savings.
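The information-theoretic cancellation can be checked numerically. In this sketch I take $S$ to be the majority bit together with the first $n$ positions where it occurs (one concrete choice of witness; any deterministic choice works), and verify the chain rule $H(X) = H(S) + H(X \mid S)$ for uniformly random sequences:

```python
from itertools import product
from math import log2
from collections import Counter

n, N = 3, 5  # sequences of length 2n - 1 = 5

def witness(bits, n):
    """The 'order' Ramsey guarantees: the majority bit together with
    the first n positions where it occurs."""
    b = 0 if bits.count(0) >= n else 1
    pos = tuple(i for i, x in enumerate(bits) if x == b)[:n]
    return (b, pos)

seqs = list(product((0, 1), repeat=N))
p = 1 / len(seqs)  # uniform distribution over sequences

H_X = log2(len(seqs))  # entropy of the sequence: N bits

# H(S): entropy of (the position and type of) the witness
counts = Counter(witness(s, n) for s in seqs)
H_S = -sum((c * p) * log2(c * p) for c in counts.values())

# H(X | S): given the witness, X is uniform over the c sequences
# producing it, contributing log2(c) bits each, weighted by P(S).
H_X_given_S = sum((c * p) * log2(c) for c in counts.values())

# Chain rule: learning S removes exactly as much entropy as it carries.
assert abs(H_X - (H_S + H_X_given_S)) < 1e-9
print(f"H(X) = {H_X}, H(S) = {H_S:.3f}, H(X|S) = {H_X_given_S:.3f}")
```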

The view of randomness via compression is a profitable one, and a pigeonhole argument tells us that most data must be truly random in this sense: it is impossible for every length-$N$ sequence to be compressible to a shorter one, because there are not enough shorter sequences to go around.
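The counting behind that pigeonhole argument is short enough to write out directly: there are $2^N$ strings of length $N$ but only $2^{N-k} - 1$ descriptions shorter than $N - k$ bits, so the fraction of strings compressible by $k$ or more bits is below $2^{-k}$.

```python
N = 20  # string length (any value works; 20 keeps the numbers readable)
for k in range(1, 6):
    # Descriptions of length 0, 1, ..., N - k - 1: a geometric sum.
    shorter = 2 ** (N - k) - 1
    fraction = shorter / 2 ** N
    # At most a 2**(-k) fraction of strings can save k or more bits.
    assert fraction < 2 ** (-k)
    print(f"compressible by >= {k} bits: fewer than {fraction:.2%} of strings")
```

So already fewer than half of all strings can be compressed by even a single bit, and the fraction halves with every additional bit saved.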

The "Ramsey-theoretic" result I gave was silly, but talking about monochromatic cliques in a complete graph just increases mystification without changing the result. You can see a discussion of Ramsey's theorem itself in the same context here, and the same thing will happen for all other results in Ramsey theory as well.