Given the 32-bit machine word 1100 0001 1011 0000 0000 0000 0000 0000,
can I find the decimal number represented by this word assuming that it is
(a) a two's complement integer; (b) an unsigned integer; (c) a single-precision IEEE 754 floating-point number?
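One way to check all three interpretations is to start from the raw bit pattern (0xC1B00000 in hex) and reinterpret it each way. This is a minimal Python sketch, not part of the original question; for (c) it reuses the same four bytes via `struct` rather than decoding the sign, exponent, and significand fields by hand:

```python
import struct

# The 32-bit pattern from the question: 1100 0001 1011 0000 ... = 0xC1B00000
word = 0b1100_0001_1011_0000_0000_0000_0000_0000

# (b) Unsigned integer: the bit pattern read directly as a non-negative value.
unsigned = word

# (a) Two's complement: if the sign bit (bit 31) is set, subtract 2**32.
signed = word - (1 << 32) if word & (1 << 31) else word

# (c) IEEE 754 single precision: reinterpret the same 4 bytes as a big-endian float.
single = struct.unpack('>f', word.to_bytes(4, 'big'))[0]

print(unsigned)  # 3249537024
print(signed)    # -1045430272
print(single)    # -22.0
```

For (c) the manual decoding agrees: sign = 1, exponent field = 10000011₂ = 131, so the exponent is 131 − 127 = 4; the significand is 1.011₂ = 1.375; the value is −1.375 × 2⁴ = −22.0.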