Uniqueness of the sum of roots of unity


Everyone knows that the sum of all $n$-th roots of unity $r_k$ equals zero: $$\sum^{n-1}_{k=0} r_k=\sum^{n-1}_{k=0} e^{\frac{i2k\pi}{n}} =0$$

Does anyone know whether it is possible to prove that the equation

$$\sum^{n-1}_{k=0} a_k r_k=\sum^{n-1}_{k=0} a_k e^{\frac{i2k\pi}{n}} =0,$$ where $a_k\in \mathbb{N}$ (non-negative integers),

has only one solution: $a_0=a_1=\dots=a_{n-1}=\text{constant}$?

Can you suggest a direction to look in? Maybe the theory of Kummer rings can help, or perhaps a simple proof exists.

There are 2 answers below.

BEST ANSWER

For $n = 4$, $(a_0, a_1, a_2, a_3) = (1,1,1,1)$ is a solution, and $(1,0,1,0)$ is another solution, as Calvin Lin pointed out in the comments above.

So the sum of these two solution vectors, $(2,1,2,1)$, is yet another solution, and it consists only of positive integers.
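As a quick numerical check (a Python sketch; the helper name `root_sum` is mine, not from the thread), all three coefficient vectors really do make the weighted sum of the fourth roots of unity vanish:

```python
import cmath

def root_sum(coeffs):
    """Return sum of a_k * e^{2*pi*i*k/n}, where n = len(coeffs)."""
    n = len(coeffs)
    return sum(a * cmath.exp(2j * cmath.pi * k / n)
               for k, a in enumerate(coeffs))

# All three vectors from the answer are solutions for n = 4:
for coeffs in [(1, 1, 1, 1), (1, 0, 1, 0), (2, 1, 2, 1)]:
    assert abs(root_sum(coeffs)) < 1e-12
```

This directly exhibits non-constant solutions, so the conjectured uniqueness fails for composite $n$.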

ANSWER

I'll assume the coefficients are real (I think posing this additional constraint makes the question slightly more interesting) and that $n\ge 3$.

The idea is linearity. Let \begin{align*} {\bf s} &= (1,\cos(2\pi /n),\dots, \cos(2\pi (n-1)/n))\\ {\bf i} &= (0,\sin (2\pi/n),\dots, \sin (2\pi (n-1)/n)) \end{align*}

represent the real vectors consisting of the real and imaginary parts of $(1, \omega, \dots, \omega^{n-1})$ where $\omega = e^{2\pi i /n}$.

Next define $T:{\mathbb R}^n \to {\mathbb R}^2$ by letting

$$ T {\bf a} = ({\bf a}\cdot {\bf s} ,{\bf a} \cdot {\bf i}),$$

where ${\bf a} = (a_0,\dots, a_{n-1})$. Then $T$ is linear and $\sum_{j=0}^{n-1} a_j \omega^j=0$ if and only if $T{\bf a}=(0,0)$.
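A small numerical sketch of this map (plain Python; the function name `T` just mirrors the definition above):

```python
import cmath
import math

def T(a):
    """The linear map T: R^n -> R^2, a -> (a.s, a.i), with s, i as above."""
    n = len(a)
    s = [math.cos(2 * math.pi * j / n) for j in range(n)]
    im = [math.sin(2 * math.pi * j / n) for j in range(n)]
    return (sum(x * c for x, c in zip(a, s)),
            sum(x * c for x, c in zip(a, im)))

def complex_sum(a):
    """The complex sum a_0 + a_1*omega + ... + a_{n-1}*omega^{n-1}."""
    n = len(a)
    return sum(x * cmath.exp(2j * cmath.pi * j / n) for j, x in enumerate(a))

# T(a) is exactly (Re, Im) of the complex sum, so one vanishes iff the other does.
```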

Since $n\ge 3$, it follows that $T$ is onto. Indeed, $Te_1 =(1,0)$ and $T e_2 = (\cos 2\pi/n,\sin 2\pi/n)$, which is linearly independent of $T e_1$ because $\sin(2\pi/n)\neq 0$ for $n\ge 3$.

Therefore, by the rank-nullity theorem, the null space of $T$ has dimension $n-2$.

To find a basis for the null space, you can choose $n-2$ vectors of the form $(x_2,y_2,1,0,\dots,0)$, $(x_3,y_3,0,1,0,\dots,0)$, ..., $(x_{n-1},y_{n-1},0,\dots,0,1)$, where for each $j=2,\dots,n-1$, the constants $x_j,y_j$ are chosen so that the resulting vector is orthogonal to both ${\bf s}$ and ${\bf i}$. Solving the equations, we obtain

\begin{align*} &y_j = -\frac{\sin (2\pi j/n)}{\sin (2\pi /n)}\\ & x_j = -\cos(2\pi j/n) +\cos(2\pi/n)\frac{\sin (2\pi j/n)}{\sin (2\pi /n)}. \end{align*}

In the case $n=4$, this gives the vectors $(1,0,1,0)$ and $(0,1,0,1)$.
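The construction can be sketched numerically (the helper name `kernel_basis` is hypothetical; it just evaluates the formulas for $x_j, y_j$ above):

```python
import math

def kernel_basis(n):
    """Basis of the null space of T: vectors (x_j, y_j, 0, ..., 1, ..., 0)
    for j = 2, ..., n-1, with the 1 in position j and x_j, y_j as above."""
    t = 2 * math.pi / n  # the base angle 2*pi/n
    basis = []
    for j in range(2, n):
        y = -math.sin(j * t) / math.sin(t)
        x = -math.cos(j * t) + math.cos(t) * math.sin(j * t) / math.sin(t)
        v = [x, y] + [0.0] * (n - 2)
        v[j] = 1.0
        basis.append(v)
    return basis

# For n = 4 this recovers (1, 0, 1, 0) and (0, 1, 0, 1) up to rounding.
```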