For a programming exercise, I'm converting fractional numbers to their decimal representations. My test numbers happen to be "binary fractions" of the form:
$$\frac{1}{2^n}$$
For example:
$$ \frac{1}{2^{15}} = 0.000030517578125 $$
What baffles me is that the number of digits in the fractional part seems to be exactly $n$. In the example above, "000030517578125" has 15 digits.
I've tried this for $0 \leq n \leq 1000$, and the number of fractional digits always matched $n$.
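A minimal sketch of the kind of check I ran, assuming Python and exact integer long division so floating-point rounding can't interfere (the helper name is just illustrative):

```python
def fractional_digits(n):
    """Digits after the decimal point of 1/2**n, by base-10 long division."""
    denom = 2 ** n
    rem, digits = 1 % denom, []
    while rem:
        rem *= 10
        digits.append(str(rem // denom))  # next decimal digit
        rem %= denom
    return "".join(digits)

for n in range(1001):
    assert len(fractional_digits(n)) == n  # exactly n fractional digits
```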
Could someone explain this correlation to me?
It is easy to see why: when you build the product $\frac{1}{2} \cdot \frac{1}{2} \cdots \frac{1}{2}$ step by step, each multiplication by $\frac{1}{2}$ gains you exactly one decimal digit.
As an example, just start multiplying from the beginning:
$\frac{1}{2} = 0.5$ (1 digit)
$\frac{1}{2}\cdot\frac{1}{2} = 0.25$ (2 digits)
$0.25\cdot\frac{1}{2} = 0.125$ (3 digits)
$0.125\cdot\frac{1}{2} = 0.0625$ (4 digits)
Until you hit $\frac{1}{2^{n}}$, which has $n$ decimal digits. The count grows by exactly one at each step because the expansion always ends in $5$, and halving pushes that final $5$ one place further to the right, so the expansion gains exactly one digit and again ends in $5$.
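A quick sketch of this stepwise halving, assuming Python and its exact `Fraction` arithmetic (just to make the pattern visible, not part of the argument):

```python
from fractions import Fraction

x = Fraction(1)
for n in range(1, 6):
    x /= 2
    # exact n-digit fractional part: scale by 10**n (2**n divides 10**n)
    digits = str(x.numerator * 10**n // x.denominator).zfill(n)
    print(f"1/2^{n} = 0.{digits}  ({n} digits)")
```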
EDIT:
Take the decimal part of $\frac{1}{2} = 0.5$, namely $5$. Divide it by $2$: $\frac{5}{2} = 2.5$. Now divide that by $10$ and you get $0.25$, which is $\frac{1}{2^{2}}$.
Next take the decimal part of the previous value, $25$. Divide it by $2$: $\frac{25}{2} = 12.5$. Divide that by $100$ and you get $\frac{1}{2^{3}} = 0.125$.
Take the decimal part of the previous value, $125$. Divide it by $2$: $\frac{125}{2} = 62.5$. Divide that by $1000$ and you end up with $\frac{1}{2^{4}} = 0.0625$.
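A short sketch of this procedure, iterated (Python assumed); note that dividing by $2$ and then by the next power of $10$ is the same as replacing the digit block $D$ by $5D$:

```python
# Keep the fractional digits of 1/2**n as the integer D, so 1/2**n = D / 10**n.
D, n = 5, 1                              # 1/2 = 0.5
for _ in range(3):
    # D ends in 5, so D/2 ends in .5; dividing by 2 and then by the next
    # power of 10 replaces D by 5*D, occupying one more digit position
    # (zero-padded via zfill when 5*D doesn't grow in length).
    D, n = 5 * D, n + 1
    print(f"1/2^{n} = 0.{str(D).zfill(n)}")   # 0.25, 0.125, 0.0625
```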