Let $\alpha\in\mathbb{C}$ be a root of the equation $z^2+z+1=0~(1)$ and $\beta\in\mathbb{C}$ be a root of the equation $u^2-u+1=0~(2)$.
Prove that if $\alpha^{2n}+\alpha^n+1=0$ then $\beta^{2n}+\beta^n+1=0$ and vice versa, where $n\in\mathbb{N}$.
What I did:
Multiplying $\alpha^2+\alpha+1=0$ by $(\alpha-1)$: $\alpha^3-1=0\Rightarrow\alpha^3=1$
$\Rightarrow \alpha$ is a primitive cube root of unity (note $\alpha\neq1$, since $z=1$ does not satisfy $(1)$), so the other root of $(1)$ is $\alpha^2$.
$\alpha^{2n}+\alpha^n+1=0\Rightarrow\alpha^n\text{ is a root of }(1)\Rightarrow \alpha^n=\alpha~\lor~\alpha^n=\alpha^2\Rightarrow n\in\{3k+1,3k+2\}$, i.e. $3\nmid n$.
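To make the last implication explicit, the case split (using only $\alpha^3=1$ from above) is
$$\alpha^{2n}+\alpha^n+1=\begin{cases}1+1+1=3, & n\equiv0\pmod 3,\\ \alpha^2+\alpha+1=0, & n\equiv1\pmod 3,\\ \alpha+\alpha^2+1=0, & n\equiv2\pmod 3.\end{cases}$$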
However, multiplying $(2)$ by $\beta+1$ gives $\beta^3+1=0$, so $\beta^3=-1$ and $\beta^6=1$. Hence for $n=3k+1$: $\beta^n=(\beta^3)^k\beta=(-1)^k\beta$ and $\beta^{2n}=\beta^{6k+2}=\beta^2$, so $\beta^{2n}+\beta^n+1=\beta^2+(-1)^k\beta+1$, which is $0$ only when $k$ is odd (for even $k$ it equals $2\beta\neq0$, by $(2)$).
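As a numerical sanity check (a quick Python sketch, not part of the argument; $e^{2\pi i/3}$ and $e^{i\pi/3}$ are one explicit choice of $\alpha$ and $\beta$, and the conjugate roots behave identically):

```python
import cmath

alpha = cmath.exp(2j * cmath.pi / 3)  # a root of z^2 + z + 1 = 0
beta = cmath.exp(1j * cmath.pi / 3)   # a root of u^2 - u + 1 = 0

def vanishes(x, n, tol=1e-9):
    """Check whether x^(2n) + x^n + 1 is numerically zero."""
    return abs(x**(2 * n) + x**n + 1) < tol

for n in range(1, 13):
    print(n, vanishes(alpha, n), vanishes(beta, n))

# The alpha identity holds for every n not divisible by 3,
# but the beta identity holds only for n = 2, 4, 8, 10, ...
# (n even and not divisible by 3) -- so n = 1 already separates them.
```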
Am I doing something wrong or is there a problem with the statement?
Thanks in advance.
You are right, it's a problem with the statement. Just consider the case $n=1$. Writing $\omega$ for a primitive cube root of unity, $(1)$ takes the well-known form $$\omega^2+\omega+1=0.$$ Since $(u+\omega)(u+\omega^2)=u^2+(\omega+\omega^2)u+\omega^3=u^2-u+1$, the roots of $(2)$ are $\beta=-\omega$ and $\beta=-\omega^2$, and neither of them satisfies $\beta^2+\beta+1=0$, so the claimed implication already fails at $n=1$. Alternatively, you could approach the problem by writing the roots of unity in Euler's form.
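For concreteness, taking $\omega=e^{2\pi i/3}$ (so $\omega-\omega^2=i\sqrt3$), the $n=1$ values work out to
$$(-\omega)^2+(-\omega)+1=\omega^2-\omega+1=1-i\sqrt3\neq0,\qquad(-\omega^2)^2+(-\omega^2)+1=\omega-\omega^2+1=1+i\sqrt3\neq0.$$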