I've tried to convert $23$ to binary and came up with $100111$, using a calculation inspired by this answer:
1) Find out the least significant bit:
$$ 23 = \underbrace{(e_n\times 2^n + ... + e_1\times 2^1)}_{22} + 1\times2^0 $$
That bit is $1$ here. Continue by shifting right, i.e. dividing by $2$:
2) $22/2 = 11 = 10 + 1$ // next bit is 1
3) $10/2 = 5 = 4 + 1$ // next bit is 1
4) $4/2 = 2 = 2 + 0$ // next bit is 0
So I'm left with the 2 in decimal, which is 10 in binary. Now I'm writing down the number:
Writing $10$ followed by the bits from steps 4, 3, 2, 1 gives me $100111$; however, the answer is $10111$. Where is my mistake?
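As a sanity check, here is a rough Python sketch of the repeated-division method I'm trying to follow (the function name is mine):

```python
def to_binary(n):
    """Convert a positive integer to a binary string by repeatedly
    taking the least significant bit (n % 2) and shifting right (n //= 2)."""
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # record the least significant bit
        n //= 2                  # shift right by one bit
    return "".join(reversed(bits))  # bits were collected LSB-first

print(to_binary(23))  # 10111
```

Running this on $23$ does produce `10111`, so the procedure itself is fine and the error must be in how I apply it.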
Let's start with some powers of $2$:
$$1,\ 2,\ 4,\ 8,\ 16.$$
We need to write out $23$, so this is enough powers of $2$.
Start with the highest power of $2$ that is equal to or less than the number. This is $16$. So write a $1$ down and subtract it out: $23-16 = 7$.
The next highest power, $8$, is greater than $7$, so write a zero: $10$.
The next one, $4$, is not greater than $7$, so subtract it out ($7-4=3$) and write a $1$: $101$.
The next one, $2$, is not greater than $3$, so subtract it out ($3-2=1$) and write another $1$: $1011$.
The last one, $1$, is not greater than $1$, so subtract it out ($1-1=0$) and write a $1$: $10111$, and we're done.
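The subtract-the-largest-power procedure above can be sketched in Python (a rough sketch; the function name is mine):

```python
def to_binary_by_powers(n):
    """Convert a positive integer to a binary string by subtracting
    out powers of 2, from the largest one that fits down to 1."""
    # Find the highest power of 2 that is <= n (16 for n = 23).
    p = 1
    while p * 2 <= n:
        p *= 2
    bits = ""
    while p >= 1:
        if p <= n:        # the power fits: write a 1 and subtract it out
            bits += "1"
            n -= p
        else:             # the power is too big: write a 0
            bits += "0"
        p //= 2
    return bits

print(to_binary_by_powers(23))  # 10111
```

Unlike the division method, this one produces the bits from the most significant end first, so nothing has to be reversed at the end.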