How do I prove the multiplicative inverse in complex arithmetic?


I am working with Sheldon Axler's Linear Algebra Done Right and I could use some help from you guys. I have the following property of complex arithmetic:

> for every $\alpha \in \mathbb{C}$ with $\alpha \neq 0$, there exists a unique $\beta \in \mathbb{C}$ such that $\alpha\beta = 1$

I want to prove this. I start with the definition of multiplication of complex numbers:

$(a + bi)(c + di) = (ac - bd)+(ad+bc)i$

And the fact that:

$1 = (1 + 0i)$

Aaaand I run out of ideas. I really have no idea how to continue, so a clue would be greatly appreciated.


There are 3 answers below.

BEST ANSWER

Basic operations: $\;\alpha:=x+iy\neq0\iff x^2+y^2\neq0\;$ (since $\;x,y\in\Bbb R\;$), so:

$$\alpha\alpha^{-1}=1\implies\alpha^{-1}=\frac1\alpha=\frac1{x+iy}\cdot\frac{x-iy}{x-iy}=\frac x{x^2+y^2}-\frac y{x^2+y^2}i$$

and there you have a Cartesian expression for $\;\alpha^{-1}\;$ whenever $\;0\neq\alpha\in\Bbb C\;$.
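As a quick sanity check (mine, not part of the answer), Python's built-in complex type lets you compare this Cartesian formula against ordinary complex division; the values of $x$ and $y$ below are arbitrary examples:

```python
# Sanity check of the Cartesian formula alpha^{-1} = x/(x^2+y^2) - [y/(x^2+y^2)] i
# using Python's built-in complex type; x and y are arbitrary example values.
x, y = 3.0, -4.0                # any pair with x^2 + y^2 != 0
alpha = complex(x, y)

r = x**2 + y**2                 # |alpha|^2
inv = complex(x / r, -y / r)    # the formula from the answer

print(inv)                      # approximately 1/alpha
print(alpha * inv)              # approximately (1+0j)
```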


You start from $(a+bi)(c+di) = (ac-bd)+(ad+bc)i$ and impose the conditions $ac-bd = 1$ and $ad+bc=0$. The second condition says the vector $(c,d)$ is orthogonal to $(b,a)$, so $d = -\lambda b$ and $c = \lambda a$ for some real $\lambda$. Plugging these into the first condition you end up with $\lambda(a^2+b^2)=1$, i.e. $\lambda = \frac{1}{a^2+b^2}$, and so $c = \frac{a}{a^2+b^2}$, $d = -\frac{b}{a^2+b^2}$. So your inverse is $$\beta = \frac{a}{a^2+b^2}-i\frac{b}{a^2+b^2}.$$
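To double-check the algebra (my own verification, not part of the answer), exact rational arithmetic confirms that these values of $c$ and $d$ satisfy both conditions; $a$ and $b$ below are arbitrary nonzero examples:

```python
from fractions import Fraction

# Verify that c = a/(a^2+b^2), d = -b/(a^2+b^2) satisfy
# ac - bd = 1 and ad + bc = 0, using exact rational arithmetic.
a, b = Fraction(2), Fraction(5)   # arbitrary example with (a, b) != (0, 0)
r = a**2 + b**2

c = a / r
d = -b / r

assert a*c - b*d == 1   # real part of (a+bi)(c+di) is 1
assert a*d + b*c == 0   # imaginary part is 0
print(c, d)             # 2/29 -5/29
```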


You have the following:

$ac-bd=1$, Eq. 1

$bc+ad=0$, Eq. 2

Recognize this as a linear system of two equations in the two unknowns $c, d$ and solve it using linear combinations. But before you do so, check the coefficient matrix:

$\left[\begin{matrix}a&-b\\b&a\end{matrix}\right]$

Observe that this guarantees a unique solution if and only if the determinant $a^2+b^2$ is nonzero. For real $a$ and $b$ this holds unless $a=b=0$: the square of a nonzero real number is strictly positive and the square of any real number is nonnegative, so $a^2+b^2>0$ whenever $(a,b)\neq(0,0)$. Therefore a unique solution is guaranteed for every nonzero complex number $a+bi$.
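To illustrate (a quick check of my own, with made-up values), the determinant of this matrix really is $a^2+b^2$:

```python
# Determinant of the coefficient matrix [[a, -b], [b, a]] is
# a*a - (-b)*b = a^2 + b^2: zero only when a = b = 0.
def det(a, b):
    return a * a - (-b) * b

print(det(3, 4))   # 25: nonzero, so a unique inverse exists for 3 + 4i
print(det(0, 0))   # 0: the only degenerate case, alpha = 0
```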

Now for a linear combination solution. Choosing the multipliers so that the terms containing $d$ cancel gives:

$a^2c-abd=a$, Eq. 1 × $a$

$b^2c+abd=0$, Eq. 2 × $b$

And then the sum contains only one unknown:

$(a^2+b^2)c=a$, so $c=\frac{a}{a^2+b^2}$

Substitute this value of $c$ into either Eq. 1 or Eq. 2 and you can then solve for $d=-\frac{b}{a^2+b^2}$.
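The elimination above translates directly into a few lines of code. This is a sketch of my own (the function name `complex_inverse` is made up, and exact rational arithmetic is used so the checks are exact):

```python
from fractions import Fraction

def complex_inverse(a, b):
    """Return (c, d) with (a+bi)(c+di) = 1, following the
    elimination above; assumes (a, b) != (0, 0)."""
    r = a * a + b * b        # determinant a^2 + b^2, nonzero by assumption
    c = a / r                # from (a^2 + b^2) c = a
    if a != 0:
        d = -b * c / a       # back-substitute c into Eq. 2: bc + ad = 0
    else:
        d = 1 / (-b)         # Eq. 1 with a = 0 reduces to -bd = 1
    return c, d

c, d = complex_inverse(Fraction(1), Fraction(2))
print(c, d)                  # 1/5 -2/5, i.e. (1+2i)^{-1} = 1/5 - (2/5)i
```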