Understanding a definition of randomness


I happened across this definition of randomness in Wikipedia:

[A]n infinite sequence is random if and only if it withstands all recursively enumerable null sets.

What does this mean?

If I recall my elementary Set Theory class correctly, this means that a sequence of numbers is random if at each iteration, the sequence generated is not a member of any set created by any previous iteration of the sequence. Is that a correct (and hopefully coherent) understanding?

Please help me understand this.

There is 1 solution below.


This is not a set-theoretic definition of randomness, which is probably why you're having so much trouble. "Recursively enumerable" is a concept from computability theory: a set of finite binary strings is recursively enumerable if there is a Turing machine (or just a computer program, if you're not familiar with Turing machines) which eventually outputs $1$ when given a string in the set, but never halts when given a string not in the set. The most complicated example (in a precise sense: it is complete among recursively enumerable sets) is the halting problem: the set of binary strings encoding computer programs that eventually halt, i.e. do not go into infinite loops.
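To make "eventually outputs $1$ but may never halt" concrete, here is a toy sketch in Python. The set used here (strings containing a $1$) is of course decidable, but the program deliberately exposes only the semi-decision interface: it returns $1$ on members and searches forever on non-members. The function name `semi_decide` is my own invention.

```python
def semi_decide(s):
    """Toy semi-decider: output 1 if the binary string s contains a '1'.

    On a member of the set it eventually returns 1; on a non-member
    (an all-zero string) the unbounded search below never terminates.
    """
    i = 0
    while True:
        if i < len(s) and s[i] == "1":
            return 1  # found a witness: accept
        i += 1        # no '1' found yet; keep searching forever
```

Calling `semi_decide("0010")` returns $1$; calling it on `"0000"` simply never comes back, which is exactly the behavior the definition permits.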

In this context, a "recursively enumerable null set" (usually called a Martin-Löf test) is a sequence $U_1, U_2, \dots$ of recursively enumerable sets of finite binary strings, chosen so that the measure of the infinite sequences extending some string in $U_n$ is at most $2^{-n}$ - intuitively, think of each $U_n$ as a guess about what the first few bits of your sequence might be, with the guesses getting progressively more specific as $n$ grows. The sequence of sets also has to be uniform, which just means that a single program can enumerate all of the $U_n$ at once.

Finally, "withstands" in this context would be better described as "escapes". We have a sequence of guesses getting gradually more specific; a sequence is random if every such sequence of guesses eventually commits to something wrong about it - that is, some $U_n$ contains no prefix of the sequence at all.

So, in other words: A sequence is random if no program can give information about it that is both reliable and arbitrarily specific.

This may all have been too much to cover in one answer - I'm giving you the quick-and-dirty version here, but to really grasp this notion you need the equivalent of an introductory computability course under your belt.