Why doesn't $\sqrt{a \cdot b}=\sqrt{a} \cdot \sqrt{b}$ hold true when one of $a$ and $b$ is an imaginary number?

I know that $\sqrt{1}=\sqrt{(-1)\cdot(-1)}$, and we can't expand $\sqrt{(-1)\cdot(-1)}$ as $\sqrt{-1}\cdot\sqrt{-1}$, since the latter becomes $i \cdot i = -1$, which is not equal to the original number we were given, $1$.

But I want to know why, in general, we can't distribute the square root over a product of two imaginary numbers, i.e., why the square root of the product need not equal the product of the square roots.
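As a quick numerical check, here is a sketch using Python's `cmath`, whose `sqrt` returns the principal square root:

```python
import cmath

# cmath.sqrt returns the principal square root, so sqrt(-1) = i.
lhs = cmath.sqrt((-1) * (-1))          # sqrt(1) = 1
rhs = cmath.sqrt(-1) * cmath.sqrt(-1)  # i * i = -1
print(lhs, rhs)  # (1+0j) (-1+0j)
```

The two sides disagree, which is exactly the failure described above.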

There are 2 best solutions below

It is a convention that for $x\geq 0,$ the notation $\sqrt x$ denotes the non-negative $y$ satisfying $y^2=x.$ The symbol $\sqrt z$ should be avoided for complex $z$, as there is no such convention, and so $\sqrt z$ is ambiguous. The $\sqrt x$ symbol does distribute over multiplication within the non-negative reals. The reason it fails in $\mathbb C$ is simply that it doesn't always hold there, as your example shows.
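To see the contrast concretely, here is a small Python sketch (assuming `cmath.sqrt` as the principal square root): the identity holds throughout the non-negative reals but has a complex counterexample.

```python
import math
import cmath
import random

random.seed(0)
# Over the non-negative reals, sqrt distributes over multiplication.
for _ in range(1000):
    a = random.uniform(0.0, 100.0)
    b = random.uniform(0.0, 100.0)
    assert math.isclose(math.sqrt(a * b), math.sqrt(a) * math.sqrt(b))

# Over the complex numbers, the same rule fails for some inputs:
print(cmath.sqrt((-1) * (-1)) == cmath.sqrt(-1) * cmath.sqrt(-1))  # False
```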

Any nonzero number $z$ has exactly two square roots, which differ only in sign. For $z$ a positive real, we denote the positive one by $\sqrt{z}$ and the negative one by $-\sqrt{z}$. For $z$ not a non-negative real, there's no canonical choice, and so the notation $\sqrt{z}$ is effectively meaningless. (There are ways of resolving this issue, branch cuts for example, but that's not relevant here.) What we can say is that if $u^2 = a$ and $v^2 = b$, then $(uv)^2 = u^2 v^2 = ab$. There's no guarantee, though, that $uv$ will be the particular choice of the two square roots you arbitrarily have in mind for $\sqrt{ab}$.
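A small Python illustration of this last point (a sketch; `cmath.sqrt` supplies one particular choice of square root, the principal one):

```python
import cmath

a, b = -2 + 0j, -3 + 0j
u = cmath.sqrt(a)  # one of the two square roots of a: i*sqrt(2)
v = cmath.sqrt(b)  # one of the two square roots of b: i*sqrt(3)

# (uv)^2 = ab always holds, so uv is *a* square root of ab ...
print(cmath.isclose((u * v) ** 2, a * b))        # True
# ... but here uv is the other root, -sqrt(6), not the principal sqrt(6).
print(cmath.isclose(u * v, -cmath.sqrt(a * b)))  # True
```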