Okay, I tried to convert 1 million to binary by repeatedly dividing by a power of 2 and taking the remainder, then dividing that remainder by the next power of 2, and so on, and I got this: 1111010000100100000. Google says 1 million in binary is: 11110100001001000000.
Why does my result from the modular arithmetic have 19 bits, while the Google search for 1 million in binary comes up with 20 bits? Do they take it one step further to the halves place or something? Here is how I did it:
$1000000 \mod 2^{19} = 475712$ so 1
$475712 \mod 2^{18} = 213568$ so 1
$213568 \mod 2^{17} = 82496$ so 1
$82496 \mod 2^{16} = 16960$ so 1
$16960 \mod 2^{14} = 576$ so 01
$576 \mod 2^9 = 64$ so 00001
$64 \mod 2^6 = 0$ so 001000000
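The steps above can be sketched in Python. This is a minimal sketch of the method as I understand it (the function name `to_binary` is just for illustration): start from the highest power of 2 that fits into the number, then walk down through every power of 2, including $2^0$, writing a 1 when the power fits and a 0 when it doesn't.

```python
def to_binary(n):
    # Find the highest power of 2 not exceeding n
    # (for n = 1000000 this is 2^19 = 524288).
    p = 1
    while p * 2 <= n:
        p *= 2
    bits = ""
    # Walk down through every power of 2, ending at 2^0 = 1.
    # Subtracting p when it fits is the same as taking n mod p.
    while p >= 1:
        if n >= p:
            bits += "1"
            n -= p
        else:
            bits += "0"
        p //= 2
    return bits

print(to_binary(1000000))  # → '11110100001001000000' (20 digits)
print(bin(1000000)[2:])    # Python's built-in conversion, for comparison
```

Since the highest power here is $2^{19}$, the walk covers positions 19 down to 0, which is 20 digit positions in total.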