I'm having a hard time with this exercise. I have to implement an algorithm that repeatedly divides a number and, from the remainders of those divisions (read backwards), recovers the integer's digits in base 10.
Converting from base 2 to base 10 through division
1.3k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
To convert from base $10_2$ (two) to base $1010_2$ (ten) by division, you divide by ten repeatedly and keep the remainders:
$$\begin{array}{rcr} 1010\quad\underline{)\quad10\ 1011\ 0001\quad}\\ 1010\quad\underline{)\quad100\ 0100\quad}&\cdots&1001\\ 1010\quad\underline{)\quad110\quad}&\cdots&1000\\ 0\quad&\cdots&110 \end{array}$$ The first remainder, $1001_2=9_{10}$, becomes the least significant digit, and the last remainder, $110_2 = 6_{10}$, becomes the most significant digit: $$10\ 1011\ 0001_2 = 689_{10}$$
The prerequisite is that you are comfortable doing division and subtraction in binary.
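The repeated-division method above can be sketched in a few lines of Python (the function name `binary_to_decimal` and the string-based interface are my own choices for illustration; Python's unbounded integers stand in for the binary arithmetic):

```python
def binary_to_decimal(bits: str) -> str:
    """Convert a base-2 numeral string to its base-10 string by
    dividing by ten repeatedly and collecting the remainders."""
    # Evaluate the binary numeral.
    x = 0
    for b in bits:
        x = x * 2 + (1 if b == "1" else 0)
    if x == 0:
        return "0"
    digits = []
    while x > 0:
        x, r = divmod(x, 10)   # quotient and remainder
        digits.append(str(r))  # first remainder = least significant digit
    # The remainders come out least-significant first, so reverse them.
    return "".join(reversed(digits))

print(binary_to_decimal("1010110001"))  # -> 689
```

Note that the remainders appear in reverse order, exactly as in the worked division above: $1001_2 = 9$ first, $110_2 = 6$ last.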
Let's say you start with a number in base 2, call it $x$.
You get the lowest-order (rightmost) base-10 digit by taking $x \bmod 10$, then set $x = \lfloor x/10 \rfloor$ and repeat.
For example, let $x = 110011_2 = 51_{10}$. In decimal, $51 - 10\cdot 5 = 1$.
Or in binary, $$\tag1 110011 - 1010\cdot 101 = 1,$$ so the first digit is $1$; we set $x = 101$ (the quotient from the previous step) and repeat:
$$\tag2 101 - 1010\cdot 0 = 101,$$ so the next digit is $101_2 = 5$. Then we set $x = 0$ and see that we can stop.
If it is clear how to obtain equations $(1)$ and $(2)$, then it should be clear how to implement this as an algorithm.
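As a sketch of that algorithm, the following Python generator mirrors equations $(1)$ and $(2)$, printing each division step with the operands rendered in binary (the function name `decimal_digits` and the printed trace format are illustrative assumptions, not part of the answer above):

```python
def decimal_digits(x: int):
    """Yield the base-10 digits of x from least to most significant,
    printing each division step in binary as in equations (1) and (2)."""
    while x > 0:
        q, r = divmod(x, 10)  # one division step: x = 1010*q + r in binary
        print(f"{x:b} - 1010*{q:b} = {r:b}  (digit {r})")
        yield r
        x = q  # continue with the quotient

digits = list(decimal_digits(0b110011))  # x = 110011_2 = 51_10
# prints:
# 110011 - 1010*101 = 1  (digit 1)
# 101 - 1010*0 = 101  (digit 5)
print(list(reversed(digits)))  # -> [5, 1]
```

Reversing the collected remainders gives the base-10 digits in the usual most-significant-first order.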