I need to subtract
$$\begin{array}{r} & 1000000000.0000\\ -& 0.0001\\ \hline \end{array}$$
The trick is that I need to borrow a $1$ for the first $0 - 1\ldots$ but there are zeros all the way over to the leading $1$. I thought I could borrow that $1$ from the top number directly, but that gives me the result $0.0001$, which is obviously wrong.
How do I solve this?
Binary arithmetic is a lot like decimal arithmetic, but simpler: each digit can take only two values instead of ten.
In decimal arithmetic, how would you solve this:
$$\begin{array}{r} & 1000000000.0000\\ -& 0.0001\\ \hline \end{array}$$
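Written out, the borrow has to ripple through the entire run of zeros, turning each one into a $9$:
$$\begin{array}{r} & 1000000000.0000\\ -& 0.0001\\ \hline & 999999999.9999 \end{array}$$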
Now just remember that when you subtract $1$ from $10_{10}$, you get $9$, but when you subtract $1$ from $10_{2}$, you get $1$. Or to put it another way, the next number after $1$ base two is $10_2$, the next number after $11_2$ is $100_2$, the next number after $111_2$ is $1000_2$, and so forth. So just as $100_{10} - 1 = 99_{10}$ and $1000_{10} - 1 = 999_{10}$, you should understand that $100_{2} - 1 = 11_{2}$, $1000_{2} - 1 = 111_{2}$, and so forth.
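The binary problem works the same way, except that each zero in the borrow chain becomes a $1$ instead of a $9$:
$$\begin{array}{r} & 1000000000.0000\\ -& 0.0001\\ \hline & 111111111.1111 \end{array}$$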
For a more complicated subtraction such as $101.100_2 - 1011111.001_2$, you should first notice that the result will be negative, since you are subtracting a larger number from a smaller one. How would you do this if these were base-ten numbers instead of base two? I do not think you would put the smaller number on the upper line and the larger number on the lower line in the grade-school subtraction algorithm. It would be much easier to write it like this:
$$\begin{array}{r} & 1011111.001\\ -& 101.100\\ \hline \end{array}$$
Just remember that what you are really trying to calculate is $- (1011111.001_2 - 101.100_2)$, so after you have written the result of the subtraction below the horizontal line, you must copy it and put a negative sign in front of it.
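Carrying that out for this example, the rearranged subtraction gives
$$\begin{array}{r} & 1011111.001\\ -& 101.100\\ \hline & 1011001.101 \end{array}$$
so $101.100_2 - 1011111.001_2 = -1011001.101_2$ (in decimal, $5.5 - 95.125 = -89.625$).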