How could I prove the roots of the equation $x^3=1$ add up to $0$? Geometrically it is not difficult to see they'll add to $0$, but when working it out algebraically I don't know how to add $e^{2\pi i/3}+e^{4\pi i/3}$.
Prove that the roots of $x^3=1$ add to 0
Let $\omega$ be a primitive third root of unity, so $\omega^3=1$ and $\omega\neq1$. Then by the finite geometric series $$ 1+\omega+\omega^2=\frac{\omega^3-1}{\omega-1}=0. $$ We do not even need to use $e^{i\theta}$: $1,\omega,\omega^2$ are exactly the roots of $X^3-1=0$.
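As a quick numerical sanity check (a Python sketch, not part of the proof), one can sum the three cube roots of unity $e^{2\pi ik/3}$ directly:

```python
import cmath

# The three cube roots of unity: e^{2*pi*i*k/3} for k = 0, 1, 2.
roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

# Their sum vanishes, up to floating-point error.
total = sum(roots)
print(abs(total))
```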
If $\omega\neq 1$ is a root of $x^3=1$, then $\omega^2\neq\omega$ is also a root, because $(\omega^2)^3 = (\omega^3)^2 = 1$; and so is $\omega^3=1$ itself. The equation $x^3=1$ has degree $3$ and three distinct roots, so the sum of the roots is $\omega+\omega^2+\omega^3$.
The root $\omega\neq1$ acts like a generator of a cyclic group: its powers run through all the distinct roots.
Look closely at $\omega(\omega+\omega^2+\omega^3) = \omega^2 + \omega^3 +\omega^4 = \omega^2 + \omega^3 +1\cdot\omega = \omega+\omega^2+\omega^3$.
It follows that $(\omega-1)(\omega+\omega^2+\omega^3)=0$, so either $\omega = 1$ or $\omega+\omega^2+\omega^3 = 0$.
Since $\omega \neq 1$, we conclude $\omega+\omega^2+\omega^3 = 0$.
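This cancellation can also be checked numerically (a Python sketch; the variable `w` plays the role of $\omega$):

```python
import cmath

# A primitive cube root of unity, w != 1.
w = cmath.exp(2j * cmath.pi / 3)

# w + w^2 + w^3 should vanish up to rounding, as the argument shows.
s = w + w**2 + w**3
print(abs(s))
```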
Let $\omega, \omega^2, 1$ be the three distinct cube roots of unity. Then, we have
$$\omega(\omega^2 + \omega + 1) = \omega^3 + \omega^2 + \omega = 1 + \omega^2 + \omega$$
Subtracting gives $(\omega-1)(\omega^2 + \omega + 1)=0$. Since $\omega \neq 1$, it follows that $$\omega^2 + \omega + 1=0$$
For any monic degree-$n$ complex polynomial $$ f(x)=x^n+a_{n-1}x^{n-1}+\cdots+a_1x+a_0\\ =(x-r_1)(x-r_2)\cdots(x-r_n) $$ where $r_1,\dots,r_n$ are the $n$ roots (counted with multiplicity), we have $$ a_{n-1}=-(r_1+r_2+\cdots+r_n). $$ In your case, you're after the sum of the three roots of the monic degree-$3$ polynomial $x^3-1$: its $x^2$ coefficient is $a_2=0$, so the sum of the roots is $-a_2=0$.
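To see this Vieta relation in action numerically, one can compute the roots of $x^3-1$ and their sum; a small sketch using `numpy.roots` (assuming NumPy is available):

```python
import numpy as np

# Coefficients of x^3 - 1, highest degree first: x^3 + 0*x^2 + 0*x - 1.
coeffs = [1, 0, 0, -1]
roots = np.roots(coeffs)

# The sum of the roots equals -a_{n-1} = -0 = 0, up to floating-point error.
print(abs(roots.sum()))
```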