Prove that $x^3 - 3x + c$ has at most one root in $[0,1]$, no matter what $c$ may be


Prove that $x^3 - 3x + c$ has at most one root in $[0,1]$, no matter what $c$ may be.

That $f(x)$ is a decreasing function on the interval $[0,1]$ seems evident from substituting the endpoint values $0$ and $1$:

$f(0) = c \quad f(1) = c-2$

But my doubt is that the values taken by $f'(x) = 3x^2 - 3$ are increasing:

$f'(0) = -3$
$f'(1/4) = -2.8125$
$f'(1/2) = -2.25$
$f'(1) = 0$

This means the slope of $f(x)$ is increasing on the interval $[0,1]$. Doesn't that make $f'(x)$ an increasing function on $[0,1]$?
How is this possible? If $f(x)$ is decreasing, then $f'(x)$ should be decreasing too, right? That is not happening in the above case. Please suggest some hints or point out where I am going wrong.
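The two observations in the question are not in conflict, and a quick numeric check makes that concrete. The sketch below (plain Python, with illustrative helper names `f` and `fprime`) evaluates both functions on a grid over $[0,1]$: $f$ strictly decreases there while $f'$ strictly increases yet never becomes positive.

```python
# f(x) = x^3 - 3x + c and f'(x) = 3x^2 - 3 on a grid over [0, 1].
# c only shifts f vertically and does not affect f', so take c = 0.
def f(x, c=0.0):
    return x**3 - 3*x + c

def fprime(x):
    return 3*x**2 - 3

xs = [i / 10 for i in range(11)]          # 0.0, 0.1, ..., 1.0
f_vals = [f(x) for x in xs]
fp_vals = [fprime(x) for x in xs]

# f is strictly decreasing on [0, 1]: successive values drop.
assert all(a > b for a, b in zip(f_vals, f_vals[1:]))
# f' is strictly increasing on [0, 1] ...
assert all(a < b for a, b in zip(fp_vals, fp_vals[1:]))
# ... yet never positive there, which is what "f decreasing" requires.
assert all(v <= 0 for v in fp_vals)
```

The point: "$f$ decreasing" is about the *sign* of $f'$, not about whether $f'$ itself rises or falls.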

4 Answers

BEST ANSWER

"If $f(x)$ is strictly decreasing, then $f'(x)$ should be decreasing, right?"

No. $f(x)$ is strictly decreasing$^1$ if $f'(x) < 0$.

In your problem, we have $f'(x) = 3x^2-3 = 3(x^2-1)$, which satisfies $f'(x) < 0$ for $x\in [0,1)$, so $f$ is strictly decreasing there and thus injective. This actually holds on the closed interval $[0,1]$ even though $f'(1)=0$, by the same grain-of-salt argument: $x=1$ is an isolated zero of $f'$.


$^1$ With a grain of salt: $f(x)$ might be strictly decreasing even if $f'(x) = 0$, for example if $f'(x)=0$ only for isolated points like with $f(x) = -x^3$ and $f'(x) = -3x^2$.
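The footnote's example can be checked numerically. This sketch (illustrative names `g`, `gprime`) confirms that $f(x) = -x^3$ is strictly decreasing even though its derivative vanishes at $x = 0$:

```python
# g(x) = -x^3 is strictly decreasing on all of R, even though
# g'(x) = -3x^2 vanishes at the single isolated point x = 0.
def g(x):
    return -x**3

def gprime(x):
    return -3*x**2

xs = [i / 4 - 1 for i in range(9)]        # -1.0, -0.75, ..., 1.0
vals = [g(x) for x in xs]

assert all(a > b for a, b in zip(vals, vals[1:]))   # strictly decreasing
assert gprime(0) == 0                                # derivative zero at 0
assert all(gprime(x) < 0 for x in xs if x != 0)      # negative elsewhere
```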

ANSWER

We'll give a short proof by contradiction.

Suppose, by way of contradiction, that $f(x)$ has at least two zeroes in $[0,1]$, say $a$ and $b$ with $f(a)=f(b)=0$. Then, by Rolle's Theorem, there must exist a $k\in(a,b)\subseteq(0,1)$ such that $f'(k)=0$.

Thus there must exist a $k\in(0,1)$ such that $3k^2-3=0$. But $3k^2-3=0$ implies $k=\pm1$. Thus there exists no $k\in(0,1)$ such that $f’(k)=0$. Thus $f(x)$ cannot have multiple zeroes in $[0,1]$.
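The Rolle argument can be sanity-checked numerically. A sketch (pure Python, with a hypothetical helper `count_sign_changes`): sweep over values of $c$ and count sign changes of $f$ on a fine grid over $[0,1]$; since $f$ is strictly monotone there, each sign change marks one root, and the count never exceeds one.

```python
def count_sign_changes(c, n=1000):
    """Count sign changes of f(x) = x^3 - 3x + c on a grid over [0, 1].

    f is strictly monotone on [0, 1], so each sign change on the grid
    corresponds to exactly one root.
    """
    vals = [(i / n)**3 - 3*(i / n) + c for i in range(n + 1)]
    return sum(1 for a, b in zip(vals, vals[1:]) if a * b < 0)

# Sweep c from -3 to 5: never more than one root in [0, 1].
for k in range(-30, 51):
    assert count_sign_changes(k / 10) <= 1
```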

ANSWER

You don't care whether $f'(x)$ is increasing or decreasing; all you have to know is that it is non-zero on $(0,1)$. But if you had two roots $u$ and $v$ in $[0,1]$, then by Rolle's Theorem there would necessarily exist $\xi\in(u,v)$ such that $f'(\xi)=0$. Contradiction.

ANSWER

$$f(x)=x^3-3x+c \implies f'(x)=3(x^2-1) <0 \quad \forall~ x \in (0,1).$$ So $f(x)=0$ has at most one real root in $[0,1]$, irrespective of the value of $c$.

Additionally, if $f(0)f(1)=c(c-2)<0$, i.e. $0<c<2$, then the equation has exactly one real root in $(0,1)$.
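When $0<c<2$, the intermediate value theorem guarantees a root and strict monotonicity makes it unique, so bisection converges to it. A minimal sketch (the helper name `root_in_unit_interval` is illustrative):

```python
def root_in_unit_interval(c, tol=1e-12):
    """Bisection for the unique root of x^3 - 3x + c in [0, 1].

    Requires 0 < c < 2, so that f(0) = c > 0 > c - 2 = f(1); since f
    is strictly decreasing on [0, 1], the root is unique.
    """
    f = lambda x: x**3 - 3*x + c
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) > 0:
            lo = mid          # f still positive: root lies to the right
        else:
            hi = mid
    return (lo + hi) / 2

r = root_in_unit_interval(1.0)
assert abs(r**3 - 3*r + 1.0) < 1e-9   # r really is a root
```

Because $f$ is strictly decreasing on $[0,1]$, the bracket $[lo, hi]$ always contains the root, so plain bisection suffices; no derivative-based method is needed.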