How many bit-flips are required to achieve random distribution?


If I have a binary number W bits wide, initially all set to zero, and I repeatedly pick a random bit and toggle it from zero to one or vice versa, how many times would I need to do this to achieve maximum entropy?

I hope I am using the term "maximum entropy" correctly -- what I mean is a point where the distribution of ones and zeros is as random as possible, and no amount of continued toggling will make the distribution any more random.
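A tiny simulation of the process described in the question makes the behavior concrete (a sketch only; the width `W = 64`, the seed, and the checkpoints are arbitrary choices of mine):

```python
import random

W = 64                      # word width (arbitrary choice for illustration)
rng = random.Random(42)     # fixed seed for reproducibility
word = [0] * W              # start with all bits zero
checkpoints = {10, 100, 1000}
for step in range(1, 1001):
    word[rng.randrange(W)] ^= 1     # toggle one uniformly random bit
    if step in checkpoints:
        print(step, sum(word) / W)  # fraction of ones so far
```

After enough toggles the fraction of ones hovers around $\frac12$, though any single run still fluctuates by roughly $1/(2\sqrt{W})$.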


There are 2 answers below.

Best answer:

Suppose we start with $N$ zero bits and perform $F$ flips, each toggling a uniformly random bit. Let's look at a single bit.

There are $(N-1)^{F-k}\binom{F}{k}$ ways that our bit can be flipped exactly $k$ times: choose which $k$ of the $F$ flips hit our bit, and let each of the remaining $F-k$ flips hit any of the other $N-1$ bits. There are $N^F$ ways the random choices can go in total, so define $p_k$:

$$p_k = \frac{(N-1)^{F-k}}{N^F}\binom{F}{k}$$

The generating function $G(x)$ for the number of flips is:

$$G(x) = \sum_k p_kx^k = \left(\frac{N-1}{N} + \frac{x}{N}\right)^F$$
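As a numerical sanity check (a sketch of my own, not part of the answer): the number of times a fixed bit is chosen among $F$ uniform picks is binomial with success probability $1/N$, which is exactly what $G(x)$ encodes. The helper name `p_k` is mine:

```python
import math
import random

def p_k(N, F, k):
    # Probability that a fixed bit is chosen exactly k times in F uniform picks
    return (N - 1) ** (F - k) * math.comb(F, k) / N ** F

N, F = 8, 20
# The p_k are a binomial distribution with p = 1/N, so they must sum to 1
assert abs(sum(p_k(N, F, k) for k in range(F + 1)) - 1.0) < 1e-12

# Monte Carlo check: count how often bit 0 is picked exactly k times
rng = random.Random(1)
trials = 200_000
counts = [0] * (F + 1)
for _ in range(trials):
    k = sum(1 for _ in range(F) if rng.randrange(N) == 0)
    counts[k] += 1
for k in range(5):
    print(k, counts[k] / trials, p_k(N, F, k))  # empirical vs. exact
```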

We're interested in distinguishing the cases where the bit is flipped an even number of times and an odd number of times. This is easy to do. For any polynomial $p(x)$, $\frac{p(x) + p(-x)}{2}$ creates a new polynomial containing only the even terms of $p$, and $\frac{p(x) - p(-x)}{2}$ contains only the odd terms of $p$.

$$p_\text{even} = \frac{G(1) + G(-1)}{2}\\ p_\text{odd} = \frac{G(1) - G(-1)}{2}$$

Obviously, $G(1) = 1$. But $G(-1) = \left(\frac{N-2}{N}\right)^F$, which is positive for $N > 2$. So you will always have:

$$p_\text{even} = \frac12 + \frac12\left(\frac{N-2}{N}\right)^F\\ p_\text{odd} = \frac12 - \frac12\left(\frac{N-2}{N}\right)^F\\ $$

For large $N$,
$$\left(\frac{N-2}{N}\right)^F \approx e^{-2F/N},$$
so the bias away from $\frac12$ decays exponentially in $F/N$: once $F$ is a few multiples of $N$, each bit is essentially a fair coin, and further toggling cannot make the distribution any more uniform.
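A quick Monte Carlo sketch (the function names are my own) comparing simulation against the closed form $p_\text{odd} = \frac12 - \frac12\left(\frac{N-2}{N}\right)^F$:

```python
import random

def p_one_exact(N, F):
    # Closed form: probability a fixed bit is flipped an odd number of times
    return 0.5 - 0.5 * ((N - 2) / N) ** F

def p_one_sim(N, F, trials=100_000, seed=0):
    rng = random.Random(seed)
    ones = 0
    for _ in range(trials):
        # count how many of the F picks hit bit 0, keep only the parity
        flips = sum(1 for _ in range(F) if rng.randrange(N) == 0)
        ones += flips % 2
    return ones / trials

N, F = 16, 40
print(p_one_exact(N, F), p_one_sim(N, F))  # the two values should nearly agree
```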

Second answer:

After $N=2T$ toggles of the $W$-bit word, the average value of any bit (summing over the odd flip counts $2k+1$) is $$\sum_{k=0}^{T-1}\frac{(W-1)^{2T-2k-1}}{W^{2T}}{2T\choose 2k+1}\\ =\frac12-\frac12\left(\frac{W-2}W\right)^{2T}$$
and the second term is roughly $\frac12e^{-2N/W}$. Use enough toggles to make that difference small enough.
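Taking that approximation at face value, requiring $\frac12 e^{-2N/W} \le \varepsilon$ gives $N \ge \frac{W}{2}\ln\frac{1}{2\varepsilon}$. A small sketch (the helper name is an assumption of mine):

```python
import math

def toggles_needed(W, eps):
    # Smallest toggle count N with (1/2) * exp(-2N/W) <= eps,
    # i.e. each bit's expected value is within eps of 1/2.
    return math.ceil(W / 2 * math.log(1 / (2 * eps)))

for W in (32, 64, 256):
    print(W, toggles_needed(W, 1e-6))
```

The required toggle count grows only linearly in the width $W$ and logarithmically in the desired accuracy.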