Is it always true that if $a^2=b$ then $a=\pm\sqrt{b}$?


Is it always true that if $a^2=b$ then $a=\pm\sqrt{b}$ ? I've seen it stated that if $x^2=k$ then $x=\pm\sqrt{k}$ where k is real, but have not seen the more general case, so I'm wondering if there is a reason why not. If this is not true generally, what are some counter examples that could be shown on an Algebra 1 level?

Edit: A better statement of my original question might be this...

It appears that "if $a=b$ then $a+c=b+c$" is universally true regardless of context. Are there contexts in which $a^2=b$ could not be equivalently written as $a=\pm\sqrt{b}$ ? Would any such context be understandable on an Algebra 1 level? Perhaps where still with $a$ and $b$ limited as either complex numbers or expressions with complex coefficients.

Thank you!


BEST ANSWER

In any (commutative) field a quadratic equation like $x^2 = k$ can have at most $2$ solutions. If the field is the field $\Bbb{R}$ of real numbers, then we know that $x^2 = k$ has no solutions if $k < 0$, $1$ solution if $k = 0$ and $2$ solutions if $k > 0$. In the real numbers, we can also conveniently define $\sqrt{k}$ for $k \ge 0$ to be the non-negative solution of $x^2 = k$, so that the solutions of $x^2 = k$ are indeed $x = \pm \sqrt{k}$.
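The real-number case split above can be sketched in code. This is a minimal illustrative helper (the name `real_square_roots` is mine, not from the answer):

```python
import math

# Real solutions of x^2 = k, following the case split above:
# none for k < 0, one for k = 0, two for k > 0.
def real_square_roots(k):
    if k < 0:
        return []                    # no real solutions
    if k == 0:
        return [0.0]                 # one solution
    r = math.sqrt(k)                 # the non-negative root, sqrt(k)
    return [r, -r]                   # two solutions: +sqrt(k) and -sqrt(k)

print(real_square_roots(-2))  # []
print(real_square_roots(0))   # [0.0]
print(real_square_roots(9))   # [3.0, -3.0]
```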

If we move to the field $\Bbb{C}$ of complex numbers, then $x^2 = k$ always has $1$ or $2$ solutions: $1$ if $k = 0$ and $2$ otherwise. In $\Bbb{C}$, we can still think of the solutions as comprising $\pm \sqrt{k}$, but the function $\sqrt{\cdot}$ is trickier to define (there are many possible definitions, depending on the choice of what is called a branch cut).
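As a concrete sketch of the branch-cut point: Python's `cmath.sqrt` uses the principal branch (cut along the negative real axis), so it picks one of the two square roots, and the other is its negation.

```python
import cmath

k = -3 + 4j
r = cmath.sqrt(k)       # principal square root: 1 + 2j
print(r, -r)            # the two solutions of x^2 = k
print(abs(r * r - k))   # ~0, up to floating-point error
```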

If we carry on with the Cayley-Dickson construction and move to $\Bbb{H}$, the quaternions, which is a division ring (like a field but with a multiplication that is not commutative) then an equation like $x^2 = k$ typically has infinitely many solutions. In $\Bbb{H}$, it doesn't make much sense to try to define $\sqrt{\cdot}$.
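To make the quaternion claim concrete: every *unit pure* quaternion (zero real part, norm $1$) squares to $-1$, so $x^2 = -1$ already has a whole circle (in fact a sphere) of solutions. Here is a minimal sketch with hand-rolled quaternion multiplication (the function `qmul` is mine, for illustration):

```python
import math

# Quaternion product of (w, x, y, z) tuples, using the standard
# Hamilton multiplication rules.
def qmul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

# A few points on a circle of solutions to x^2 = -1:
for t in range(5):
    q = (0.0, math.cos(t), math.sin(t), 0.0)  # unit pure quaternion
    print(qmul(q, q))                         # each is ~(-1, 0, 0, 0)
```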

I think the above goes beyond Algebra 1, but I hope it is of interest.

Another answer:

Yes: for $b \ge 0$, the equation $a^2=b$ always has solutions, two when $b>0$ and one when $b=0$, denoted $\sqrt b$ and $-\sqrt b$.

For a graphical interpretation, we can consider the equivalent problem of intersecting the parabola $y=x^2$ with the horizontal line $y=b\ge 0$, which indeed has solutions $x=\pm \sqrt b$.

[Figure: the parabola $y=x^2$ intersected by a horizontal line $y=b$ at the two points $x=\pm\sqrt b$]


Edit

When $b<0$ we need complex numbers, and notably the imaginary unit defined by $i=\sqrt{-1}$; then we have

$$a^2= b=-1 \cdot (-b) \implies a=\pm i \sqrt {(-b)} $$
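A quick numerical check of this identity (with $b=-9$ as an example):

```python
import math

b = -9
a = 1j * math.sqrt(-b)   # a = i * sqrt(-b), per the identity above
print(a, -a)             # the two solutions: 3j and -3j
print(a * a)             # (-9+0j), recovering b
```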

For a quadratic equation of the form $ax^2+bx+c=0$ or, more generally, for a polynomial equation $p_n(x)=0$, the Fundamental Theorem of Algebra guarantees that we can always find $n$ solutions in the complex field (counted with multiplicity).
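For the quadratic case ($n=2$), the familiar formula illustrates this: working over $\Bbb{C}$, we always get two roots with multiplicity, even when the discriminant is negative. A minimal sketch:

```python
import cmath

# Quadratic formula over C: ax^2 + bx + c = 0 always has
# 2 roots counted with multiplicity, illustrating the FTA for n = 2.
def quadratic_roots(a, b, c):
    d = cmath.sqrt(b*b - 4*a*c)   # works even for negative discriminant
    return ((-b + d) / (2*a), (-b - d) / (2*a))

print(quadratic_roots(1, 0, 1))   # x^2 + 1 = 0   -> roots i and -i
print(quadratic_roots(1, -2, 1))  # (x - 1)^2 = 0 -> double root 1
```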

Another answer:

Here is another (in my opinion) interesting approach to your question. However, I'm not sure if you're already familiar with this, so my answer might be underwhelming.

In your question you made it sound like $b$ has exactly two square roots whenever $a^2=b$, namely $\sqrt{b}$ and $-\sqrt{b}$. The following just illustrates that this isn't always the case:

Let $A \in \mathbb{R}^{n\times n}$ be a positive semi-definite matrix. Then it can be shown that there exists exactly one positive semi-definite matrix $B \in \mathbb{R}^{n\times n}$ such that $A=B^2$, i.e. $B$ is a square root of $A$, and one often writes $B = A^{\frac12}$. For example, you can check that $$ A := \begin{pmatrix} 5&4\\ 4&5 \end{pmatrix} \implies A^{\frac12}= \begin{pmatrix} 2&1\\ 1&2 \end{pmatrix} $$ and that these matrices are indeed positive semi-definite.

Of course, as in the case of real numbers, if $B$ is a root of $A$, then so is $-B$. However, a matrix can have more than two roots. In fact, the identity matrix $I_2 \in \mathbb{R}^{2\times2}$ has infinitely many complex roots, for example $$ I_2 = \begin{pmatrix} 1&0\\ 0&1 \end{pmatrix} = \begin{pmatrix} 1&z\\ 0&-1 \end{pmatrix}^2 $$ for all $z \in \mathbb{C}$.

There are also matrices that have no roots at all, such as $$ A= \begin{pmatrix} 0&1\\ 0&0 \end{pmatrix}. $$ Showing that this matrix indeed has no square root is a nice short exercise.
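Both matrix claims are easy to verify numerically. This sketch uses plain nested lists (the helper `matmul2` is mine, just to keep the example dependency-free):

```python
# 2x2 matrix product for matrices stored as nested lists.
def matmul2(a, b):
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

B = [[2, 1], [1, 2]]
print(matmul2(B, B))          # [[5, 4], [4, 5]], i.e. B^2 = A

# A whole family of square roots of the identity, one for each z:
for z in (0, 1, 2.5, 3 + 4j):
    R = [[1, z], [0, -1]]
    print(matmul2(R, R))      # the 2x2 identity (zero entry may print as 0, 0.0, or 0j)
```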

I hope this was helpful in some way!