I have to find the bit complexity, essentially the number of bit operations involved, of an algorithm that converts a number to its binary form. Here is the algorithm for a number n:
X ← binary representation of 0
for i ← 1 to n:
    starting from the right in X, find the first digit that is 0; suppose it is the kth digit
    X ← flip the kth digit of X to 1, and flip the 1st, 2nd, …, (k−1)th digits of X to 0
    print X
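For concreteness, here is a rough Python sketch of what I mean (X is stored as a little-endian bit list, and all names are mine):

```python
def count_in_binary(n):
    X = [0]                       # X[0] is the rightmost (least significant) digit
    for _ in range(n):
        k = 0
        while k < len(X) and X[k] == 1:   # find the first 0 digit from the right
            k += 1
        if k == len(X):
            X.append(0)           # all digits were 1: grow the representation
        X[k] = 1                  # flip the kth digit to 1
        for j in range(k):
            X[j] = 0              # flip the digits below it to 0
        print(''.join(str(b) for b in reversed(X)))

count_in_binary(8)   # prints 1, 10, 11, 100, 101, 110, 111, 1000
```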
We are thus incrementing by 1, n times. Essentially, the number of bit operations depends on the value of k in each iteration, but I can't seem to find a pattern for the position of the first zero bit in the binary representations of 1, 2, 3, …, n. Any help?
The position of the first zero bit in $k$ (zero-indexed from the right; in other words, the number of $1$s the binary representation of $k$ ends with) is equal to the number of times $2$ is a factor in $k+1$. (For example, $23 = 10111_2$ ends with three $1$s, and $23+1 = 24$ is divisible by $2^3$.)
There is no nicer formula for this number. The number of ending $1$s in $k=0,1,2,\dots$ is $$0, 1, 0, 2, 0, 1, 0, 3, 0, 1, 0, 2, 0, 1, 0, 4, 0, 1, 0, \dots$$ which you can sort of see a pattern in.
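If you want to check this numerically, here is a small Python snippet (function names are mine) that counts the trailing $1$s directly and via the factor-of-$2$ rule, and reproduces the sequence above:

```python
def trailing_ones(k):
    """Number of 1s the binary representation of k ends with."""
    count = 0
    while k & 1:
        count += 1
        k >>= 1
    return count

def twos_in(m):
    """Number of times 2 is a factor in m."""
    count = 0
    while m % 2 == 0:
        count += 1
        m //= 2
    return count

# trailing_ones(k) == twos_in(k + 1) for every k
assert all(trailing_ones(k) == twos_in(k + 1) for k in range(1000))

print([trailing_ones(k) for k in range(19)])
# [0, 1, 0, 2, 0, 1, 0, 3, 0, 1, 0, 2, 0, 1, 0, 4, 0, 1, 0]
```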
But for your problem, we don't really need a formula. Instead, it's enough to observe that:

- about half the values end in a $0$, and incrementing them takes $1$ bit operation;
- about a quarter end in $01$, and incrementing them takes $2$ bit operations;
- about an eighth end in $011$, and incrementing them takes $3$ bit operations;
And so on. So the average number of operations it takes to increment a value between $0$ and $n-1$ is at most $$ \frac12 \cdot 1 + \frac14 \cdot 2 + \frac18 \cdot 3 + \dots = \sum_{i=1}^\infty \frac{i}{2^i} = 2. $$ In other words, though some numbers take more operations than others to increment, and occasionally just before a power of $2$ you need lots of operations, the average number of bit operations to increment a number is at most $2$, a constant. Therefore the total number of bit operations needed to increment from $0$ a total of $n$ times is $O(n)$.
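If you'd like to see the bound concretely, here is a short Python check (names are mine). Incrementing a number that ends in $j$ ones costs $j+1$ bit flips, so we can tally the total cost directly:

```python
def total_flips(n):
    """Total bit flips used by the flip-based counter to go from 0 up to n."""
    total = 0
    for k in range(n):          # incrementing k -> k+1
        j = 0
        while (k >> j) & 1:     # count the trailing 1s of k
            j += 1
        total += j + 1          # j ones flipped to 0, plus one 0 flipped to 1
    return total

for n in [10, 100, 1000, 10**6]:
    print(n, total_flips(n), total_flips(n) / n)
# the average cost per increment stays below 2, matching the bound above
```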