How do I prove that $f(x)=x^n+x^{n-1}+\dots+x^2-nx+1=0$ (where $n\in\mathbb Z_{>2}$) has one and only one root in $(0,1)$?
My idea is to assume there are two roots $a,b$ in $(0,1)$; then $f(a)-f(b)=0$, which gives $\sum_{i=1}^{n} (a^i-b^i) = (n+1)(a-b)$. But how do I continue from here to a contradiction?
By Descartes' rule of signs
https://en.wikipedia.org/wiki/Descartes%27_rule_of_signs
our polynomial has at most two positive roots: its coefficient sequence $1,1,\dots,1,-n,1$ has exactly two sign changes.
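Not part of the proof, but as a quick sanity check one can count those sign changes for several values of $n$ (a small Python sketch; the helper names are mine):

```python
def sign_changes(coeffs):
    """Count sign changes in a coefficient list, skipping zeros (Descartes' rule)."""
    signs = [c > 0 for c in coeffs if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def coeffs(n):
    """Coefficients of x^n + x^(n-1) + ... + x^2 - n*x + 1, highest degree first."""
    return [1] * (n - 1) + [-n, 1]

for n in range(3, 12):
    # Two sign changes: at most two positive roots for every n > 2.
    assert sign_changes(coeffs(n)) == 2
```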
But $x=1$ is a root, since $f(1)=(n-1)-n+1=0$, and the second positive root lies in $(0,1)$: we have $f(0)=1>0$, while $f(x)<0$ just to the left of $x=1$ (because $f(1)=0$ and $f'(1)>0$), so the intermediate value theorem gives a root in $(0,1)$.
Indeed, $f'(1)=n+(n-1)+\dots+2-n=\frac{(n-2)(n+1)}{2}>0$ for $n>2$, so in particular $f'(1)\neq0$ and the root $1$ has multiplicity $1$.
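The closed form for $f'(1)$ can be double-checked numerically (again just a sketch, with a helper name of my own):

```python
def fprime_at_1(n):
    # f'(x) = n x^(n-1) + (n-1) x^(n-2) + ... + 2x - n, evaluated at x = 1.
    return sum(k for k in range(2, n + 1)) - n

for n in range(3, 20):
    # Matches (n-2)(n+1)/2 and is strictly positive, so x = 1 is a simple root.
    assert fprime_at_1(n) == (n - 2) * (n + 1) // 2
    assert fprime_at_1(n) > 0
```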
Also, for $x>1$ we obtain: $$f(x)=(x^n-x)+(x^{n-1}-x)+\dots+(x^3-x)+(x-1)^2>0,$$ since every bracket is positive there; hence $f$ has no root greater than $1$.
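Finally, the whole conclusion can be sanity-checked by scanning for sign changes of $f$ on a fine grid over the open interval $(0,1)$ (a numerical sketch, not a proof; the grid resolution and function names are my own choices):

```python
def f(x, n):
    # f(x) = x^n + x^(n-1) + ... + x^2 - n*x + 1
    return sum(x**k for k in range(2, n + 1)) - n * x + 1

def roots_in_unit_interval(n, steps=10_000):
    """Count sign changes of f on a fine grid strictly inside (0, 1)."""
    xs = [i / steps for i in range(1, steps)]  # exclude the endpoints 0 and 1
    vals = [f(x, n) for x in xs]
    return sum(1 for a, b in zip(vals, vals[1:]) if a * b < 0)

for n in range(3, 10):
    # Exactly one sign change, i.e. exactly one root in (0, 1).
    assert roots_in_unit_interval(n) == 1
```

For $n=3$, for instance, $f(x)=x^3+x^2-3x+1=(x-1)(x^2+2x-1)$, whose root in $(0,1)$ is $\sqrt2-1\approx0.4142$.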