Isn't $\frac{ax+bi}{ax+bi}$ equal to $1$?


Isn't $\dfrac{ax+bi}{ax+bi}$ equal to $1$? Here $i=\sqrt{-1}$ and $a, b, x \in \mathbb{R}$.


3 Answers

Best answer

Division in the field of complex numbers is carried out by multiplying numerator and denominator by the conjugate of the denominator: $\frac{ax+bi}{ax+bi}=\frac{(ax+bi)(ax-bi)}{(ax+bi)(ax-bi)}=\frac{a^2x^2+b^2}{a^2x^2+b^2}$, which is a real number equal to $1$. So your intuition is correct, provided the denominator is nonzero. If $b=0$ and $ax=0$ (that is, $a=0$ or $x=0$, or both), then $a^2x^2+b^2=0$ and the expression is undefined.
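A quick numerical sanity check of the argument above, using Python's built-in `complex` type (the sample values of $a$, $b$, $x$ are arbitrary choices for illustration):

```python
# Check that (a*x + b*i)/(a*x + b*i) == 1 whenever a*x + b*i != 0.
a, b, x = 2.0, 3.0, 1.5
z = complex(a * x, b)        # represents a*x + b*i
print(z / z)                 # (1+0j)

# When a*x = 0 and b = 0 the denominator is zero and the quotient is undefined;
# Python signals this by raising ZeroDivisionError.
z0 = complex(0.0, 0.0)
try:
    print(z0 / z0)
except ZeroDivisionError:
    print("0/0 is undefined")
```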

Answer

If we assume that at least one of $ax$ and $b$ is non-zero, then yes. The complex numbers under the usual operations of addition and multiplication form a field.

Answer

In the complex field, every nonzero complex number $z$ has an inverse $z^{-1}$. It is customary to write $$ wz^{-1}=\frac{w}{z} $$ By definition, $zz^{-1}=1$, so in the alternative notation $$ \frac{z}{z}=1 $$ for every complex number $z\ne0$.

Now take $z=ax+bi$ and you have your result. Of course, if $ax+bi=0$ (that is, $ax=0$ and $b=0$), the expression $(ax+bi)/(ax+bi)$ is undefined.
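The inverse construction used here can be sketched numerically: for a nonzero $z$, one explicit formula is $z^{-1}=\bar{z}/|z|^2$, and then $zz^{-1}=1$ up to floating-point rounding (the value of `z` below is an arbitrary nonzero example):

```python
# For z != 0, an explicit inverse is conj(z) / |z|**2, so z * z**-1 == 1.
z = complex(2.0, -5.0)                  # arbitrary nonzero example
z_inv = z.conjugate() / abs(z) ** 2     # inverse via the conjugate formula
print(z * z_inv)                        # approximately (1+0j)
```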