I am studying how computers solve mathematical problems while avoiding numerical errors, but there is a mathematical formulation I don't understand.
So let's say we have the equation $x^2-2bx+c=0$ with $c=1$ and $b=10^{10}$, whose two roots are $x_1 \approx 2 \cdot 10^{10}$ and $x_2 \approx 5 \cdot 10^{-11}$, correct to 16 digits.
Looking at my professor's notes below, I don't understand how the "difference of squares" formula can be used in any way to get: $$ x_2 = \frac{c}{b+\sqrt{b^2-c}} $$

Observe that,
$$\begin{aligned} x_2 &= b - \sqrt{b^2 - c} \\ &= (b - \sqrt{b^2 - c}) \frac{b + \sqrt{b^2 - c}}{b + \sqrt{b^2 - c}}\\ &= \frac{(b - \sqrt{b^2 - c})(b + \sqrt{b^2 - c})}{b + \sqrt{b^2 - c}}\\ &= \frac{b^2 - (b^2 - c)}{b + \sqrt{b^2 - c}}\\ &= \frac{c}{b + \sqrt{b^2 - c}}. \end{aligned}$$
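To see why this rewriting matters numerically, I tried the following quick check in Python (a minimal sketch I wrote myself; the variable names are mine, not from the notes). In double precision, $\sqrt{b^2 - c}$ rounds to exactly $b$ when $b = 10^{10}$ and $c = 1$, so the original expression $b - \sqrt{b^2 - c}$ cancels to zero, while the rewritten form keeps full accuracy:

```python
import math

b = 1e10
c = 1.0

# Original form: b - sqrt(b^2 - c).
# b*b = 1e20, and 1e20 - 1 rounds back to 1e20 in double precision,
# so sqrt(b*b - c) evaluates to exactly b -> catastrophic cancellation.
x2_naive = b - math.sqrt(b * b - c)

# Rewritten form from the notes: c / (b + sqrt(b^2 - c)).
# The denominator is a sum of two positive numbers, so no cancellation.
x2_stable = c / (b + math.sqrt(b * b - c))

print(x2_naive)   # 0.0 -- every significant digit lost
print(x2_stable)  # 5e-11 -- close to the true root
```

So if I understand correctly, the algebra is exact either way; the point of the manipulation is purely to avoid subtracting two nearly equal numbers.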