Assume $x^2+x+1=0$ --①
Thus $x+1=-x^2$ --②
Since $x=0$ is not a root of ①, we may divide both sides of ① by $x$ to get
$x+1+\frac{1}{x}=0$,
By ②, this becomes $-x^2+\frac{1}{x}=0$.
Multiplying both sides by $x$ gives $-x^3+1=0$, so $x^3=1$ and thus $x=1$.
I know that the other two complex roots of $x^3=1$ are roots of the original equation, and I know there's something wrong with the division step, but what exact principle does it break?
The overall effect of your manipulations (dividing by $x$, subtracting, and multiplying by $x$) is to say $$x^2+x+1=0 \\ \implies (x-1)(x^2+x+1)=0 \\ \implies x^3-1=0 \\ \implies x^3=1$$ and there is nothing wrong with those as one-way implications.
That final equation has three complex roots, one of which is $x=1$. But that one is in fact not a root of the first equation. This spurious solution was introduced by the multiplication by $(x-1)$, which is $0$ when $x=1$.
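A quick numerical check (a sketch in Python, not part of the original argument) makes this concrete: all three cube roots of unity satisfy $x^3=1$, but only the two complex ones satisfy the original equation $x^2+x+1=0$; at $x=1$ the left-hand side is $3$, not $0$.

```python
import math

# A primitive cube root of unity: omega = -1/2 + (sqrt(3)/2) i
omega = complex(-0.5, math.sqrt(3) / 2)

def f(x):
    # Left-hand side of the original equation x^2 + x + 1
    return x**2 + x + 1

for r in (1, omega, omega.conjugate()):
    # Every r satisfies r^3 = 1, but f(r) = 0 only for the complex roots
    print(r, abs(r**3 - 1) < 1e-9, abs(f(r)) < 1e-9)
```

The first printed line shows `1` passing the $x^3=1$ test but failing $f(x)=0$, which is exactly the spurious root introduced by multiplying by $(x-1)$.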