Question: Show that all roots of $p(x) = 11x^{10} - 10x^9 - 10x + 11$ lie on the unit circle $|x| = 1$ in the complex plane.
My progress so far
I tried working with the polar form of complex numbers, but my proof turned out to be incorrect. Any help or suggestions would be greatly appreciated. I assumed all the roots are non-real, because the graph of the polynomial shows that there are no real zeroes. The rational root theorem yields nothing, and I also tried to use the Fundamental Theorem of Algebra.


Hint: $\;p(x)=x^5\left(11\left(x^5+\frac{1}{x^5}\right) - 10\left(x^4+\frac{1}{x^4}\right)\right)$. Since $x = 0$ is not a root, let $z=x+\frac{1}{x}\,$, express $x^2+\frac{1}{x^2}=z^2-2$ etc. in terms of $z\,$, and derive an equation in $z$.
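For reference, the power sums in the hint all follow from one standard recurrence. Writing $s_n := x^n + \frac{1}{x^n}$ (my notation, not from the hint), one has

$$s_{n+1} = z\,s_n - s_{n-1}, \qquad s_0 = 2,\quad s_1 = z,$$

since $\left(x+\tfrac{1}{x}\right)\left(x^n+\tfrac{1}{x^n}\right) = \left(x^{n+1}+\tfrac{1}{x^{n+1}}\right) + \left(x^{n-1}+\tfrac{1}{x^{n-1}}\right)$. Iterating gives

$$s_2 = z^2-2,\quad s_3 = z^3-3z,\quad s_4 = z^4-4z^2+2,\quad s_5 = z^5-5z^3+5z.$$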
[ EDIT ] The resulting equation in $z$ is the quintic $\,11 z^5 - 10 z^4 - 55 z^3 + 40 z^2 + 55 z - 20 = 0\,$, which can be shown to have $5$ distinct real roots in $(-2,2)\,$. For each such root $z$, the corresponding equation $x^2 - z\,x + 1 = 0$ has a pair of complex conjugate roots $x$, because the discriminant $\Delta = z^2-4 \lt 0\,$. By Vieta's formulas their product is $1$; but the product of a conjugate pair $x\,\bar{x}$ equals $|x|^2$, so $|x| = 1$ for all of them, i.e. they all lie on the unit circle.
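As a sanity check (not a proof), the claims above can be verified numerically: the ten roots of $p$ should all have absolute value $1$, and the quintic in $z$ should have five real roots in $(-2,2)$. A minimal sketch using NumPy's `roots` routine:

```python
import numpy as np

# p(x) = 11x^10 - 10x^9 - 10x + 11, coefficients from highest degree down
p = [11, -10, 0, 0, 0, 0, 0, 0, 0, -10, 11]
x_roots = np.roots(p)
# every root should have modulus 1 (up to floating-point error)
print(np.abs(x_roots))

# quintic in z = x + 1/x: 11z^5 - 10z^4 - 55z^3 + 40z^2 + 55z - 20 = 0
q = [11, -10, -55, 40, 55, -20]
z_roots = np.roots(q)
# all five z-roots should be real and lie strictly inside (-2, 2)
print(z_roots)
```

Of course this only confirms the statement to machine precision; the argument via the discriminant and Vieta's formulas is what makes it rigorous.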