Prove, without using Rolle's theorem, that a polynomial $f$ with $f'(a) = 0 = f'(b)$ for some $a < b$, and $f' \neq 0$ on $(a,b)$, has at most one root in $(a,b)$


Prove the following without using Rolle's Theorem:

If $f$ is a polynomial, $f'(a) = 0 = f'(b)$ for some $a < b$, and there is no $c \in (a,b)$ such that $f'(c) = 0$, then there is at most one root of $f$ in $(a,b)$.

I've already proven this by contradiction, by assuming that there are at least two roots in $(a,b)$ and showing that this contradicts Rolle's Theorem. Now I'm wondering how I could prove it without using Rolle's Theorem.



Well, this is not an exact answer, but we can use the following thoughts to prove the statement without using Rolle's Theorem.

Since it is given that $f' \left( a \right) = f' \left( b \right) = 0$, and assuming that $f$ is a non-constant polynomial (if it is a constant nonzero polynomial, then it has no roots and the statement holds trivially), we have either $f \left( a \right) \geq 0$ or $f \left( a \right) \leq 0$. Without loss of generality, I will give the arguments for $f \left( a \right) \geq 0$.

If $f \left( a \right) = 0$, then clearly $a$ is a root. Now, since $f$ is a non-constant polynomial, there is a neighbourhood to the right of $a$ on which $f$ is either increasing or decreasing. We shall deal with the case where $f$ is increasing (a similar argument works for decreasing).

Now, suppose that this neighbourhood is smaller than $\left( a, b \right)$ and that beyond it the function starts decreasing (or becomes constant). Then $\exists c \in \left( a, b \right)$ such that $f' \left( c \right) = 0$, contradicting the hypothesis. Therefore, the neighbourhood extends at least across $\left( a, b \right)$. Since $f$ is increasing on this interval and $f \left( a \right) = 0$, there cannot be any other root in this interval. Hence, $f$ has at most one root.

Now consider the other case, $f \left( a \right) > 0$. Again, by the above arguments, $f$ is either increasing or decreasing on $\left( a, b \right)$. If it is increasing, then $f \left( b \right) > f \left( a \right) > 0$ and $f$ has no root in $\left( a, b \right)$. Similarly, if it is decreasing throughout, it can cross the $x$-axis at most once.

Therefore, in all cases, we can say that $f$ has at most one root.
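The monotonicity argument above can be sanity-checked numerically on a concrete polynomial of my own choosing (not part of the proof): $f(x) = x^3 - 3x$ has $f'(x) = 3x^2 - 3$, which vanishes exactly at $a = -1$ and $b = 1$ and is nonzero in between, so $f$ should have at most one root in $(-1, 1)$.

```python
# Illustrative check with f(x) = x^3 - 3x: f' vanishes at a = -1, b = 1
# and keeps one sign on (-1, 1), so f is strictly monotone there and can
# change sign at most once. (Polynomial chosen for illustration only.)

def f(x):
    return x**3 - 3*x

def fprime(x):
    return 3*x**2 - 3

a, b = -1.0, 1.0
n = 10_000
xs = [a + (b - a) * (i + 0.5) / n for i in range(n)]  # interior sample points

# f' keeps a single sign on (a, b) ...
assert all(fprime(x) < 0 for x in xs)

# ... so f changes sign at most once; here it does so exactly once, at x = 0.
sign_changes = sum(1 for u, v in zip(xs, xs[1:]) if f(u) * f(v) < 0)
assert sign_changes == 1
```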


Note that $f$ is smooth, and since $f'(x) \neq 0$ for $x \in (a,b)$ the intermediate value theorem shows that $f'(x)$ has the same sign for $x \in (a,b)$, so we can assume that $f'(x) > 0$ for $x \in (a,b)$.

Suppose $f(x^*) = 0$ for $x^* \in (a,b)$. Since $f(x) = \int_{x^*}^x f'(t)dt$ we see that $f(x) > 0$ for $x \in (x^*,b)$ and $f(x) < 0$ for $x \in (a,x^*)$. Hence $f$ has at most one zero in $(a,b)$.

Addendum: Note that for $p(x) = x^k$ the identity $p(x) = p(x^*) + \int_{x^*}^x p'(t)\,dt$ can be verified by direct computation, without invoking the fundamental theorem of calculus. By linearity, it then holds for any polynomial.
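The addendum's identity for a monomial can be checked numerically; the sketch below approximates the integral with a midpoint Riemann sum (the values $k = 3$, $x^* = 0.5$, $x = 2$ are arbitrary choices for illustration).

```python
# Check p(x) = p(x*) + integral of p'(t) dt from x* to x, for p(x) = x^k,
# approximating the integral with a midpoint Riemann sum.

k = 3
p = lambda x: x**k
dp = lambda t: k * t**(k - 1)  # p'

x_star, x = 0.5, 2.0

n = 100_000
h = (x - x_star) / n
integral = sum(dp(x_star + (i + 0.5) * h) for i in range(n)) * h

# p(2) = 8 should match p(0.5) + integral up to the quadrature error
assert abs(p(x) - (p(x_star) + integral)) < 1e-6
```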


(Based heavily on this answer.)

Suppose for the sake of contradiction that $f$ has two distinct roots $c < d$ in $(a,b)$. Since $f$ is a nonzero polynomial (it cannot be identically zero, as $f'$ is nonzero on $(a,b)$), it has finitely many roots. So we can suppose that $c$ and $d$ are consecutive roots of $f$: that is, $f$ is nonzero on $(c,d)$.

Since $f$ is a polynomial, we can write $$ f(x)=(x-c)^p(x-d)^qr(x) $$ for some polynomial $r$ which is nonzero on $[c,d]$. Since $r$ is a polynomial, it is continuous, so by IVT it must have the same sign on all of $[c,d]$.

Then

\begin{align} f'(x)&=(x-c)^p(x-d)^qr'(x)+q(x-c)^p(x-d)^{q-1}r(x)+p(x-c)^{p-1}(x-d)^qr(x)\\ &=(x-c)^{p-1}(x-d)^{q-1}\left[r'(x)(x-c)(x-d)+r(x)\bigl(q(x-c)+p(x-d)\bigr)\right]\\ &=(x-c)^{p-1}(x-d)^{q-1}s(x) \end{align} for some polynomial $s$.

By direct computation, $s(c)=p\,r(c)(c-d)$ and $s(d)=q\,r(d)(d-c)$. But $p, q \geq 1$ and $r(c)$, $r(d)$ have the same sign, so $s(c)$ and $s(d)$ have opposite signs. So again by IVT there is some $x_0 \in (c,d)$ (and thus also in $(a,b)$) with $s(x_0)=0$, which also means $f'(x_0)=0$, contradicting the hypothesis.
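The factorisation and the boundary values of $s$ can be verified on a concrete instance; the choices $p = 2$, $q = 1$, $c = 1$, $d = 2$, $r(x) = x^2 + 1$ below are purely illustrative.

```python
# Check f'(x) = (x-c)^{p-1} (x-d)^{q-1} s(x) with
# s(x) = r'(x)(x-c)(x-d) + r(x)(q(x-c) + p(x-d)),
# and the boundary values s(c) = p r(c)(c-d), s(d) = q r(d)(d-c).

p, q = 2, 1
c, d = 1.0, 2.0
r = lambda x: x**2 + 1   # nonzero (indeed positive) on [c, d]
dr = lambda x: 2*x       # r'

def f(x):
    return (x - c)**p * (x - d)**q * r(x)

def s(x):
    # the bracketed factor produced by the product rule
    return dr(x) * (x - c) * (x - d) + r(x) * (q * (x - c) + p * (x - d))

# boundary values have opposite signs, as the answer claims
assert s(c) == p * r(c) * (c - d)   # -4
assert s(d) == q * r(d) * (d - c)   # 5
assert s(c) * s(d) < 0

# compare the factorised derivative against a central difference of f
h = 1e-6
for x in [1.2, 1.5, 1.8]:
    fd = (f(x + h) - f(x - h)) / (2 * h)
    assert abs(fd - (x - c)**(p - 1) * (x - d)**(q - 1) * s(x)) < 1e-5
```

Since $s$ changes sign on $(c, d)$, a root finder applied to $s$ on that interval would locate the promised $x_0$ with $f'(x_0) = 0$.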