Variance of first n binary digits of pi?


I recently asked myself a question but didn't have the slightest inkling on how to solve it.

Question

Suppose we write $\pi$ in binary. Among the first $n$ digits, the average number of $0$'s is the same as the average number of $1$'s (assuming $\pi$ is normal). My question is: what is the variance of the digits as a function of $n$?


2 Answers

Best Answer

Wikipedia gives the variance of a set of values as $\frac 1{n^2}\sum_i\sum_{j\gt i}(x_i-x_j)^2$. If there are $k$ terms equal to $1$ and $n-k$ terms equal to $0$, exactly $k(n-k)$ of those pairs differ, each contributing $1$, so the variance is $\frac {k(n-k)}{n^2}$. If about half the digits are $1$, this is approximately $\frac 14$.
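As a sanity check, the pairwise formula can be compared numerically with the closed form $\frac{k(n-k)}{n^2}$. A minimal Python sketch (function names are illustrative, not from the answer):

```python
def variance_binary(bits):
    """Closed form k(n-k)/n^2 for the population variance of a 0/1 sequence."""
    n, k = len(bits), sum(bits)
    return k * (n - k) / n**2

def variance_pairwise(xs):
    """Wikipedia's pairwise form: (1/n^2) * sum over pairs i<j of (x_i - x_j)^2."""
    n = len(xs)
    return sum((xs[i] - xs[j]) ** 2
               for i in range(n) for j in range(i + 1, n)) / n**2

bits = [1, 1, 0, 0, 1]          # first five bits of pi: 11.001...
print(variance_binary(bits))    # 3*2/25 = 0.24
print(variance_pairwise(bits))  # agrees with the closed form
```

For a 0/1 sequence only the $k(n-k)$ mixed pairs contribute, each with $(1-0)^2=1$, which is why the two computations coincide.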


This isn't a proof but a demonstration of what happens. The graph below shows the sample standard deviation and the mean of the first $n$ bits of $\pi$ in base $2$, for $1\leq n\leq 128$. The standard deviation appears to converge fairly quickly to $0.5$, which agrees with Ross Millikan's answer that the variance may converge to $\frac14=0.5^2$. The mean of the bits appears to level off near $0.42$ over this range, though theoretically it would converge to $0.5$ if $\pi$ is normal.

(Figure: mean and standard deviation of the first $n$ bits of $\pi$.)

The value of $\pi=11.001\ldots_{2}$ is from http://www.exploringbinary.com/pi-and-e-in-binary/ and the graph was made in GeoGebra 5.0.274.0-3D. The .ggb file is available from https://ufile.io/gp92i.
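For readers without GeoGebra, a similar (smaller) experiment can be sketched in plain Python: an IEEE-754 double carries 53 significant bits, so the leading 50 binary digits of $\pi$ can be read off exactly from `math.pi`. This is only a sketch of the idea, not the original GeoGebra setup:

```python
import math

# Recover the leading binary digits of pi exactly from the double
# representation (assumes standard IEEE-754 doubles, as in CPython).
num, _den = math.pi.as_integer_ratio()
bits = [int(b) for b in bin(num)[2:]]   # 1100100100001111... (pi = 11.001..._2)

for n in (10, 25, 50):
    k = sum(bits[:n])                   # number of 1s among the first n bits
    mean = k / n
    stdev = math.sqrt(k * (n - k)) / n  # sqrt of k(n-k)/n^2
    print(f"n={n}: mean={mean:.3f}, stdev={stdev:.3f}")
```

Even at $n=50$ the standard deviation is already close to $0.5$, while the mean still wanders around $0.5$, matching the behaviour seen in the graph.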