I've just been introduced to complex numbers, and I have found it surprising that the radical rule $\sqrt{a}\sqrt{b}=\sqrt{ab}$ apparently still holds when exactly one of $a$ and $b$ is negative. However, if both $a$ and $b$ are negative, the rule fails. Why is this?
Here is my attempt at proving the radical rule for positive $a,b$. I was wondering whether this proof could be generalised to negative $a,b$, and whether that could form part of the explanation for when the radical rule holds. (Unfortunately, though, I have yet to learn how the natural logarithm behaves for complex arguments.)
\begin{align} \sqrt a\sqrt b &= a^{\frac{1}{2}}b^{\frac{1}{2}} \\ &=e^{\frac{1}{2}\ln a}\times e^{\frac{1}{2}\ln b} \\ &=e^{\frac{1}{2}\ln a+\frac{1}{2}\ln b} \\ &=e^{\frac{1}{2}\ln(ab)} \\ &=(ab)^{\frac{1}{2}} \\ &=\sqrt{ab} \end{align}
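One way to see where this exponential proof breaks down is numerically: with the principal branch, $\ln(ab)\ne\ln a+\ln b$ once both arguments are negative, because the sum of the imaginary parts leaves the principal strip. A small sketch using Python's `cmath` (the numbers $-4$ and $-9$ are my own illustrative choices):

```python
import cmath

# cmath.sqrt and cmath.log use the principal branch:
# Im(log z) lies in (-pi, pi].

a, b = -4, -9

lhs = cmath.sqrt(a) * cmath.sqrt(b)   # 2j * 3j = -6
rhs = cmath.sqrt(a * b)               # sqrt(36) = 6
print(lhs, rhs)                       # lhs = -6, rhs = +6: the rule fails

# With only one negative factor, the rule survives:
print(cmath.sqrt(-4) * cmath.sqrt(9), cmath.sqrt(-36))  # both are 6j

# The culprit: log(-4) + log(-9) has imaginary part pi + pi = 2*pi,
# which is outside (-pi, pi], so exp((log a + log b)/2) is no longer
# the principal square root of ab.
print(cmath.log(a) + cmath.log(b))    # imaginary part 2*pi
print(cmath.log(a * b))               # imaginary part 0
```

So the step $e^{\frac12\ln a+\frac12\ln b}=e^{\frac12\ln(ab)}$ is exactly the one that needs $\ln a+\ln b=\ln(ab)$, and that identity only holds modulo $2\pi i$ in general.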
I have also heard that part of the reason for why the radical rule only works in certain cases is because there is no way of ordering $i$ and $-i$. In other words, there is no way of saying that $i$ is 'greater than' $-i$, or vice versa. Taking this idea to the extreme, does this mean that we can't even say that $5$ is greater than $3$ when working with the complex plane?
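(For what it's worth, the ordering point can be made precise by the standard argument, which I sketch here as an aside: in any ordered field every square is $\ge0$, so if $\Bbb C$ carried a field-compatible order we would have
$$-1=i^2\ge0\quad\text{and}\quad 1=1^2\ge0\implies 0=(-1)+1\ge0+1=1>0,$$
a contradiction. So neither $i>0$ nor $-i>0$ can be chosen consistently, although $5>3$ remains perfectly meaningful for the copy of $\Bbb R$ sitting inside $\Bbb C$.)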
If $x=\sqrt{a}$ and $y=\sqrt{b}$, then $(xy)^2\stackrel{(1)}{=}x^2y^2=ab$ and $xy\stackrel{(2)}{=}\color{blue}{\pm}\sqrt{ab}$. This reasoning works both for $\sqrt{a},\,\sqrt{b}\in\Bbb R$ and for $\sqrt{a},\,\sqrt{b}\in\Bbb C$, because (1) uses commutativity (also needed in the display line below) and (2) uses the nonexistence of zero divisors, so that$$(u-v)(u+v)\stackrel{(1)}{=}u^2-v^2=0\implies u\mp v=0.$$

To lose the $\color{blue}{\pm}$ when $\sqrt{a},\,\sqrt{b}\in\Bbb R$, we use the fact that then $a,\,b\ge0$, and their square roots are defined as the non-negative choices for $x,\,y$; then $x,\,y,\,\sqrt{ab}$ are all $\ge0$, so $\sqrt{ab}$ is $xy$ rather than $-xy$. In particular, this uses the fact that the non-negative reals are closed under multiplication.

But there is no analogous half $H$ of $\Bbb C$ in which we could place square roots, i.e. a set such that (i) for each $z\in\Bbb C\setminus\{0\}$ either $z\in H$ or $z\in-H$ but not both, and (ii) $z,\,w\in H\implies zw\in H$.
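The $\color{blue}{\pm}$ claim above is easy to check numerically. A quick sketch with Python's `cmath` (the helper `close` and the sampling are my own, not part of the argument): for random complex $a,b$, the principal square roots always satisfy $\sqrt{a}\sqrt{b}=+\sqrt{ab}$ or $-\sqrt{ab}$, never anything else.

```python
import cmath
import random

random.seed(0)

def close(u, v, tol=1e-9):
    return abs(u - v) < tol

# sqrt(a)*sqrt(b) squares to ab, so it must be one of the two
# square roots of ab -- i.e. +sqrt(ab) or -sqrt(ab).
for _ in range(1000):
    a = complex(random.uniform(-10, 10), random.uniform(-10, 10))
    b = complex(random.uniform(-10, 10), random.uniform(-10, 10))
    xy = cmath.sqrt(a) * cmath.sqrt(b)
    r = cmath.sqrt(a * b)
    assert close(xy, r) or close(xy, -r)

print("sqrt(a)*sqrt(b) is always +/- sqrt(ab)")
```

Which sign occurs depends on whether the principal arguments of $a$ and $b$ sum to something inside or outside $(-\pi,\pi]$, which is exactly why two negative reals (arguments $\pi+\pi=2\pi$) pick up the minus sign.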