Is it possible for the distribution of digits of an irrational number not to converge?


So I was in a discussion with somebody about the "randomness" of pi. I pointed out that it's actually not known if pi is a normal number (although all computation seems to indicate that). But then he made the statement "it obviously has some distribution." I'm not sure about that. Obviously, any finite expansion of the first n digits will have a well defined distribution, but I don't see any reason why we would be guaranteed that in the limit of the number of digits going to infinity, that the limit exists.

Reading up on the issue, I can't seem to find anything that says one way or the other. I've found that 'most' real numbers are in fact normal. This leads me to suspect that all numbers have a definite distribution.

To be a little more specific. Let's say we are given an irrational number. We can define a function on that number for a given base B. The function takes an input of n, which is the number of digits you want to display. The output of that function would be a B-dimensional vector that gives the distribution of each digit. We could also define similar functions for any finite string of digits. My question is, are we guaranteed for the limit of that function to exist as n goes to infinity?

If not, is it known if pi has a distribution that converges?
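To make the question concrete, here is a minimal Python sketch of the function described above (the name `digit_distribution` and the hard-coded digits of pi are my own, purely illustrative):

```python
from fractions import Fraction

def digit_distribution(digits, n, B=10):
    """B-dimensional frequency vector of the first n entries of a
    base-B digit sequence: entry a is the exact fraction of those
    digits equal to a."""
    prefix = digits[:n]
    return [Fraction(prefix.count(a), n) for a in range(B)]

# First 10 digits of pi in base 10, hard-coded for illustration:
print(digit_distribution([3, 1, 4, 1, 5, 9, 2, 6, 5, 3], 10))
```

The question is then whether this vector converges entrywise as $n$ goes to infinity.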

Thanks


There are 2 best solutions below


In general, limiting digit distributions need not exist. Consider the digit sequence formed by concatenating longer and longer blocks of repeated digits, as 0 11 2222 33333333 ..., where block $k$ consists of $2^k$ consecutive copies of the digit $k\bmod 10$. At the end of the $k$-th block, $2^{k+1}-1$ digits have appeared in all, the last $2^k$ of which are the digit $k\bmod 10$. Since $2^k/(2^{k+1}-1)>1/2$, each of the 10 digits has running frequency greater than $1/2$ infinitely often; a limiting distribution would therefore have to assign every digit frequency at least $1/2$, which is impossible.
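A throwaway numerical check of this construction (my own sketch, not part of the answer) shows the running frequency of the current digit exceeding $1/2$ at every block boundary:

```python
digits = []
boundary_freqs = []
for k in range(12):
    d = k % 10
    digits += [d] * 2**k          # block k: 2^k copies of k mod 10
    freq = digits.count(d) / len(digits)
    boundary_freqs.append(freq)   # frequency of d at the end of its block

# Every boundary frequency exceeds 1/2, but the dominant digit keeps
# changing, so no single limiting distribution can exist.
print([round(f, 3) for f in boundary_freqs])
```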

In this construction there is nothing special about the sequence $k \bmod 10$. One could take any equidistributed sequence of base 10 digits (read off of a normal number, say) and repeat the block construction. This gives uncountably many digit sequences.

I have no idea if $\pi$ is "like" this.


Not at all. Consider the number $$0.100111111000000000000000000...$$ (which is irrational, since the blocks grow without bound, so the expansion is not eventually periodic). Informally, we have alternating "blocks" of $1$s and $0$s, and the $n$th block is twice as long as the sum of the lengths of all previous blocks. At the end of each block of $1$s, at least $2\over 3$ of the digits so far are $1$s, while at the end of each block of $0$s, at most $1\over 3$ are; so the frequency of $1$s keeps oscillating and has no limit.
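A quick sketch tracking the frequency of $1$s at each block boundary shows the oscillation; the boundary values in fact approach $3/4$ and $1/4$ as their two limit points:

```python
digits = []
bit, length = 1, 1
freqs_of_one = []
for _ in range(12):
    digits += [bit] * length
    freqs_of_one.append(digits.count(1) / len(digits))
    length = 2 * len(digits)   # next block is twice the total so far
    bit = 1 - bit              # blocks alternate between 1s and 0s

# Boundary frequencies oscillate, approaching 3/4 (after blocks of 1s)
# and 1/4 (after blocks of 0s), so the frequency of 1s has no limit.
print([round(f, 4) for f in freqs_of_one])
```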


EDIT: in a comment, you ask:

Are there numbers whose distribution doesn't converge for all bases?

(I don't think that numbers with this property have a name; I'll call them "absolutely strange.")

The answer is yes. Indeed, analogously to the situation with absolutely normal numbers, there is a precise sense in which "most" real numbers are absolutely strange:

While the set of absolutely normal numbers has full measure (= the set of non-absolutely-normal numbers has measure $0$), the set of absolutely strange real numbers is comeager (= the set of non-absolutely-strange numbers is a countable union of nowhere dense sets).

  • A key point here is that category and measure are orthogonal: while both "comeager" and "co-null" are notions of mostness, they are quite different, and there are lots of examples of sets which are comeager but null (or meager but co-null).

In particular, such numbers exist in the first place.

The proof is a bit tedious to write out in full detail, but here's the rough idea. For $r$ a real number, $n$ a natural number, and $b>1$ a possible base, say that $r$ is $n$-strange in base $b$ if there is a sequence $$x_1^0<x_1^1<...<x_1^{b-1}<x_2^0<x_2^1<...<x_2^{b-1}<...<x_n^0<x_n^1<...<x_n^{b-1}$$

such that for each $0\le a<b$ and each $1\le i\le n$, the digit $a$ occurs at least ${2^i-1\over 2^i}$ of the time among the first $x_i^a$ digits of $r$ in base $b$.

It's easy to show that strangeness implies, well, strangeness:

Suppose $r$ is $n$-strange in base $b$ for every $n$ and $b$. Then $r$ is absolutely strange.

So it's enough to show that "most" reals (in the sense of category) are $n$-strange in base $b$ for every $b,n$. And this is a consequence of the fact that we can always extend a finite sequence of digits to one exhibiting such a pattern (just append, for each digit in turn, a long enough run of that single digit).
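The extension step can be made concrete with a short sketch (the function `extend_to_witness` and its setup are illustrative names of mine): given any nonempty finite prefix, appending a long enough run of each digit in turn produces witnessing positions at which that digit's frequency so far is at least $(2^i-1)/2^i$:

```python
import math

def extend_to_witness(prefix, b, i):
    """Extend a nonempty base-b digit list so that each digit
    a = 0, ..., b-1 in turn makes up at least (2^i - 1)/2^i of some
    initial segment.  Returns the extended digits and the positions."""
    digits, positions = list(prefix), []
    tn, td = 2**i - 1, 2**i            # target fraction tn/td
    for a in range(b):
        n, c = len(digits), digits.count(a)
        # Smallest run length r with (c + r)/(n + r) >= tn/td:
        r = max(0, math.ceil((tn * n - td * c) / (td - tn)))
        digits += [a] * r
        positions.append(len(digits))
    return digits, positions

d, pos = extend_to_witness([0, 1, 2], b=3, i=2)
print(pos)   # → [8, 28, 108]
```

Starting from the prefix $0,1,2$ in base $3$ with $i=2$ (target frequency $3/4$), the cut points come out strictly increasing, matching the pattern $x_i^0<x_i^1<x_i^2$ required above.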

In fact, this argument proves something a bit stronger than what we really needed:

The set of numbers $r$ such that for every base $b$ and every digit $1\le a<b$ we have $$\liminf_{n\rightarrow\infty}{\mbox{number of $a$s in first $n$ digits of $r$}\over n}=0$$ but $$\limsup_{n\rightarrow\infty}{\mbox{number of $a$s in first $n$ digits of $r$}\over n}=1$$ is comeager.

So in terms of category, "most" numbers are maximally absolutely strange.