(All numbers below are in binary.)
0.100011 * 2^6 - 0.111001 * 2^3
= 0.100011 * 2^6 - 0.000111001 * 2^6
= 0.100011000 * 2^6 + 1.111000111 * 2^6 (converting the subtrahend to two's complement)
= 10.011011111 * 2^6
However, the given answer was 0.011011111 * 2^6.
Where did the first '1' go? Or am I right?
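To show exactly what I did, here is my working reproduced in Python (the 10-bit width, one integer bit plus nine fraction bits, is my assumption):

```python
WIDTH = 10  # assumed: 1 integer bit + 9 fraction bits per value

a = int("0100011000", 2)   # 0.100011000 (mantissa of the first term)
b = int("0000111001", 2)   # 0.000111001 (second term aligned to 2^6)

# two's-complement negation of b in WIDTH bits
neg_b = ((~b) + 1) & ((1 << WIDTH) - 1)
print(format(neg_b, "010b"))    # 1111000111 -> my 1.111000111

total = a + neg_b               # raw sum, keeping the extra top bit
print(format(total, "011b"))    # 10011011111 -> my 10.011011111
```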
Sorry for the lack of a specific explanation in my question, but I have a serious headache from solving so many questions like this one.
Two's complement works for integers when you have a fixed storage width for the value. (That fixed width is also where your first '1' went: the carry out of the top bit is discarded, so 10.011011111 wraps to 0.011011111.) In this problem you are working with non-integers and don't appear to have a fixed length per value, so I'd just use regular subtraction.
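A minimal sketch of the carry-out behaviour (Python; the 10-bit width is an assumption carried over from your working): in fixed-width two's-complement addition, the sum is reduced modulo 2^width, which silently drops the leading carry.

```python
WIDTH = 10                       # assumed: 1 integer bit + 9 fraction bits
MASK = (1 << WIDTH) - 1

a     = 0b0100011000             # 0.100011000
neg_b = 0b1111000111             # two's complement of 0.000111001

raw = a + neg_b                  # 11-bit sum: 10011011111
wrapped = raw & MASK             # discard the carry out of the top bit
print(format(wrapped, "010b"))   # 0011011111 -> 0.011011111
```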
$$0.100011 \times 2^6 - 0.111001 \times 2^3$$
$$=100011 - 111.001$$
If it makes it easier, you can use a pseudo two's-complement approach to do the subtraction and write $111.001$ as $1000.000-000.111$:
$$=100011-1000+0.111$$
$$=11011.111$$
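As a sanity check (Python; the decimal conversions are mine), evaluating both the original expression and the final answer in decimal confirms they agree:

```python
# Convert the original operands to decimal and subtract directly.
a = int("100011", 2) / 2**6        # 0.100011 in binary = 0.546875
b = int("111001", 2) / 2**6        # 0.111001 in binary = 0.890625

diff = a * 2**6 - b * 2**3         # 35 - 7.125 = 27.875
ans  = int("11011111", 2) / 2**3   # 11011.111 in binary
print(diff, ans)                   # 27.875 27.875
```

All values here are exactly representable in binary, so the floating-point comparison is exact.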