I need help with this exercise:
Let $\alpha$ be a zero of $x^3+x^2+1$ in an extension field of $\mathbb{Z}_2$. Show that $x^3+x^2+1$ splits in $\mathbb{Z}_2(\alpha)$. [Hint: There are eight elements in $\mathbb{Z}_2(\alpha)$. Exhibit two or more zeros of $x^3+x^2+1$, in addition to $\alpha$.]
My attempt:
Since our polynomial is irreducible over $\mathbb{Z}_2$ and has degree $3$, a basis for $\mathbb{Z}_2(\alpha)$ over $\mathbb{Z}_2$ is $1,\alpha,\alpha^2$. Our eight elements are then $0,1,\alpha,\alpha^2,1+\alpha,1+\alpha^2,\alpha+\alpha^2,1+\alpha+\alpha^2$.
I was able to show that $f(\alpha^2)=0$, because $f(\alpha^2)=\alpha^6+\alpha^4+1=\alpha^6+\alpha^4+1+\alpha^7-\alpha^7=\alpha^4(\alpha^3+\alpha^2+1)+1-\alpha^7=\alpha^4(0)+1-\alpha^7=1-\alpha^7$. Now $\alpha^7=1$, because the nonzero elements of a field form a group under field multiplication, with $1$ as the identity. Since there are $8-1=7$ nonzero elements, $\alpha^7=1$.
But now I get stuck when I try to calculate the last one. The trouble is that I need to reduce what I get to one of the possibilities: $0,1,\alpha,\alpha^2,1+\alpha,1+\alpha^2,\alpha+\alpha^2,1+\alpha+\alpha^2$.
For instance I get that $f(1+\alpha)=1+\alpha+\alpha^3$, which I do not know how to reduce.
Any tips?
It is just a matter of repeating the trick you already know about. If $\alpha$ is a root of $f = x^{3} + x^{2} + 1$, then so is $\alpha^{2}$, and then so is $\alpha^{4} = (\alpha^{2})^{2}$. This is, as you already know, because of the binomial theorem in characteristic two: $$ 0 = (\alpha^{3} + \alpha^{2} + 1)^{2} = (\alpha^{2})^{3} + (\alpha^{2})^{2} +1. $$ And of course $\alpha^{2}$ and $$ \alpha^{4} = \alpha \alpha^{3} = \alpha (\alpha^{2} + 1) = \alpha^{3} + \alpha = \alpha^{2} + \alpha + 1 $$ are in $\mathbb{Z}_{2}[\alpha]$. (Here $\alpha^{3} = \alpha^{2} + 1$ is exactly the reduction rule you were missing: it comes from $\alpha^{3}+\alpha^{2}+1=0$, since $-1 = 1$ in characteristic two.)
Note also that the coefficient $1$ of $x^{2}$ is the sum of the roots (it is actually minus the sum, but in characteristic two signs do not matter). So once you know the roots $\alpha$ and $\alpha^{2}$, the third one will be $1 - \alpha - \alpha^{2} = \alpha^{2} + \alpha + 1$.
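For what it's worth, both claims can be checked by brute force in a few lines of Python. This is just a sanity-check sketch, not part of the exercise: the helper names `gf8_mul` and `f` are made up here, and elements of $\mathbb{Z}_2(\alpha)\cong\mathrm{GF}(8)$ are represented as 3-bit integers in the basis $1,\alpha,\alpha^2$ (bit $i$ is the coefficient of $\alpha^i$, so $\alpha = 2$, $\alpha^2 = 4$).

```python
MOD = 0b1101  # x^3 + x^2 + 1 as a bit pattern

def gf8_mul(a, b):
    """Carry-less multiply over GF(2), then reduce modulo x^3 + x^2 + 1."""
    p = 0
    for i in range(3):
        if (b >> i) & 1:
            p ^= a << i
    for i in (4, 3):  # clear any terms of degree 4 and 3
        if (p >> i) & 1:
            p ^= MOD << (i - 3)
    return p

def f(t):
    """Evaluate x^3 + x^2 + 1 at t; addition in GF(8) is XOR."""
    t2 = gf8_mul(t, t)
    return gf8_mul(t2, t) ^ t2 ^ 1

alpha = 0b010
roots = [t for t in range(8) if f(t) == 0]
print(roots)                           # [2, 4, 7]: alpha, alpha^2, alpha^2+alpha+1
print(gf8_mul(alpha, alpha))           # 4 = alpha^2
print(gf8_mul(4, 4))                   # alpha^4 = 7 = alpha^2 + alpha + 1
print(roots[0] ^ roots[1] ^ roots[2])  # 1 = the x^2 coefficient, as claimed
```

The three roots found are exactly $\alpha$, $\alpha^{2}$, and $\alpha^{4} = \alpha^{2}+\alpha+1$, and they sum to $1$, matching the coefficient of $x^{2}$.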