Converting $32$-bit $2$'s complement to decimal


What is the general procedure to convert a $32$-bit $2$'s complement number to decimal? For instance, if I were given the $2$'s complement representation:

$11111111111111111111111111101011$

how would I show that the decimal number corresponding to it is $-21$?

I tried to split it into the standard $1$, $8$, and $23$ bit format and got $255$ as the exponent field, since it was $11111111$. But I wasn't sure how to use this to show that the exponent was $0$, or how to handle the mantissa/fraction section.

Any help would be greatly appreciated.


There are 2 solutions below.

BEST ANSWER

By definition of two's complement, your code $$\color{red}{1}111111111111111111111111110101\color{green}{1}$$ represents the nonpositive integer $$\color{red}{-}000000000000000000000000001010\color{green}{1}=\color{red}{-}(2^4+2^2+2^0)=-21.$$

The leftmost $\color{red}{1}$ in the code means the number is negative.
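A minimal sketch of this interpretation in Python (the function name is my own): read the bit string as an unsigned integer, then subtract $2^{32}$ whenever the leading bit is $1$, which is exactly what the two's complement convention prescribes.

```python
def twos_complement_to_int(bits: str) -> int:
    """Interpret a binary string as a two's-complement signed integer."""
    n = int(bits, 2)           # unsigned value of the bit pattern
    if bits[0] == "1":         # a leading 1 marks a negative number
        n -= 1 << len(bits)    # subtract 2^width to get the signed value
    return n

print(twos_complement_to_int("11111111111111111111111111101011"))  # -21
```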


In $2$'s complement representation, if a number starts with "$1$", then it is negative. To obtain its absolute value, use the well-known $(\sim D+1)$: invert all the bits, then add $1$. In your case this gives $(10101)_2=(16+4+1)_{10}=21$, so the number is $-21$.
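The $(\sim D+1)$ recipe can be sketched in Python as follows (a hypothetical helper of my own; the mask is needed because Python integers are unbounded, so $\sim D$ must be confined to the given bit width):

```python
def magnitude_via_invert_add_one(bits: str) -> int:
    """Absolute value of a negative two's-complement number via ~D + 1."""
    width = len(bits)
    mask = (1 << width) - 1     # keep only `width` bits after inversion
    d = int(bits, 2)            # unsigned value of the bit pattern
    return ((~d) & mask) + 1    # invert all bits, then add 1

print(magnitude_via_invert_add_one("11111111111111111111111111101011"))  # 21
```

Prepending a minus sign to the result gives the decimal value, $-21$.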