Find the cube root of unity. Hence, show that if $w^3=1$, then $1+w+w^2=0$.
The cube roots of unity are $1$, $-\frac{1}{2}+\frac{\sqrt{3}}{2}i$, and $-\frac{1}{2}-\frac{\sqrt{3}}{2}i$. If $w^3=1$, then $w^3-1=0 \implies (w-1)(1+w+w^2)=0$. The task is to prove that the second factor is zero, but we can only be sure of that if the first factor is nonzero, right? I think there is an implicit assumption that $w\ne1$ which I'm missing, yet since $w^3=1$, $w$ can be any of the three roots of unity, including $1$.
The conditional is clearly false for $w=1$: the hypothesis holds ($1^3=1$), but the conclusion does not ($1+1+1^2 = 3 \neq 0$). Therefore, you have to assume that $w \neq 1$ in order to conclude the proof. For the two non-real cube roots, $w-1 \neq 0$, so the factorization forces $1+w+w^2=0$.
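A quick numerical sketch of this can be done in Python (my own check, not part of the original question): generate the three cube roots of unity as $e^{2\pi i k/3}$ and confirm that $1+w+w^2$ vanishes exactly for the two non-real roots, while it equals $3$ at $w=1$.

```python
import cmath

# The three cube roots of unity: exp(2*pi*i*k/3) for k = 0, 1, 2.
roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

for w in roots:
    assert abs(w**3 - 1) < 1e-9        # each root satisfies w^3 = 1
    s = 1 + w + w**2
    if abs(w - 1) > 1e-9:              # non-real roots: the sum vanishes
        assert abs(s) < 1e-9
    else:                              # w = 1: the sum is 3, not 0
        assert abs(s - 3) < 1e-9
```

This mirrors the algebra: dividing $w^3-1=0$ by the nonzero factor $w-1$ is only legitimate when $w\neq1$, which is exactly the case the two assertions distinguish.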