The problem statement for Project Euler #323 is as follows:
Let $y_0, y_1, y_2, \dots$ be a sequence of random unsigned 32-bit integers (i.e. $0 \leq y_i < 2^{32}$, every value equally likely).
For the sequence $x_i$ the following recursion is given:
$x_0 = 0$ and
$x_i = x_{i-1} | y_{i-1}$ for $i > 0$, where $|$ is the bitwise-OR operator.
It can be seen that eventually there will be an index N such that $x_i = 2^{32} -1$ (a bit-pattern of all ones) for all $i \geq N$.
Find the expected value of N. Give your answer rounded to 10 digits after the decimal point.
I'm not interested in the actual solution to the problem, but I wish to understand where my attempt has gone wrong. My attempt at the problem thus far has been as follows:
For any $y_i$, the chance of any bit being a 1 is $\frac{1}{2}$, as all values are equally likely. In other words, the chance that a bit will "flip" from a 0 to a 1 on the next turn is $\frac{1}{2}$.
Thus the expected number of 1s in $x_1$ is $32 \div 2 = 16$.
Following this logic, the expected number of 1s in $x_2$ is 24, as half of the sixteen zeroes would flip.
Then the expected number of 1s in $x_3$ is 28, for $x_4$ it's 30 and for $x_5$ it's 31.
The expected waiting time for the last bit is the same as for flipping a coin until we hit a head (1), which is $\sum_{n=1}^{\infty} \frac{n}{2^n} = 2$.
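One quick way to evaluate that sum is via the standard identity $\sum_{n=1}^{\infty} n x^{n} = \frac{x}{(1-x)^{2}}$ for $|x| < 1$; at $x = \tfrac{1}{2}$ it gives
$$\sum_{n=1}^{\infty} \frac{n}{2^{n}} = \frac{1/2}{(1/2)^{2}} = 2.$$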
Therefore the expected value of N is 5 + 2 = 7.
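(As a quick sanity check, the process itself is cheap to simulate. Here is a rough Python sketch, using `random.getrandbits`, that produces an empirical estimate of N to compare against 7; the function name and trial count are just for illustration.)

```python
import random

def estimate_N(bits=32, trials=100_000):
    """Empirically estimate the expected value of N for the OR process."""
    full = (1 << bits) - 1                 # all-ones pattern, 2^bits - 1
    total = 0
    for _ in range(trials):
        x, n = 0, 0
        while x != full:
            x |= random.getrandbits(bits)  # OR in a fresh uniform random word
            n += 1
        total += n
    return total / trials

print(estimate_N())   # empirical average of N, to compare with the guess of 7
```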
However, apart from the fact that the answer is completely wrong, something makes me think that expected values just don't work that way. Can someone please clarify where I've made a mistake?
Disclaimer: Although I try to refrain from posting questions related to Project Euler on Math StackExchange, I believe that the answer to my problem would make it no easier for anyone else to solve the problem, and might in fact help others understand where they went wrong.
One thing that is going wrong is that, at the informal level, "expected value $k$" is being identified with "value $k$." So, for example, near the end, when you think the expected number of $0$'s remaining is $1$, it is quite possible that more than one $0$ remains, and also quite possible that none remain. There is no reason to think that the errors introduced by these two cases will cancel.
It is also not clear what these expectations even mean. When you say that the expected number of $0$'s after $k$ bitwise operations is $e$, are you conditioning on the event that the process has not terminated by time $k-1$?
In order to see what is going on, we will make a couple of calculations, with numbers far smaller than $32$. Imagine, for example, that the number $b$ of bits is $2$.
Then after one operation, with probability $1/4$ we will still be at $(0,0)$, with probability $1/2$ we will be at $(0,1)$ or $(1,0)$, and with probability $1/4$ we will be at $(1,1)$. Let $e$ be the expected number of operations used.
Thus with probability $1/4$ we have wasted a toss, and the expected number of tosses needed is $1+e$. With probability $1/2$, we have used $1$ toss, and the expected number of additional tosses is $2$. And with probability $1/4$ we do not need a second toss, so the expectation is $1$. It follows that $$e=\frac{1}{4}(1+e)+\frac{1}{2}(1+2)+ \frac{1}{4}(1).$$ Solve for $e$. We get $e=8/3$.
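If it helps, the same equation can be solved with exact arithmetic in a couple of lines. Here is a small Python sketch using `fractions.Fraction`, where (purely for illustration) $e_k$ denotes the expected number of tosses still needed when $k$ of the $2$ bits are still $0$:

```python
from fractions import Fraction

half, quarter = Fraction(1, 2), Fraction(1, 4)

e0 = Fraction(0)   # no zeros left: nothing more to do
e1 = Fraction(2)   # one zero left: geometric waiting time with p = 1/2
# e2 satisfies  e2 = 1/4*(1 + e2) + 1/2*(1 + e1) + 1/4*(1 + e0),
# which is the equation above (since e0 = 0 and 1 + e1 = 3).
# Rearranging:  (1 - 1/4) * e2 = 1 + 1/2 * e1 + 1/4 * e0.
e2 = (1 + half * e1 + quarter * e0) / (1 - quarter)

print(e2)          # prints 8/3
```

Running it prints `8/3`, matching the value above.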
A calculation based on my understanding of your method would say that the expectation is $1+2=3$, which does not agree with $8/3$.