Showing that a function does not have two distinct roots


I have the function $f(x)= x^3+\frac 32 x^2+\lambda$, where $\lambda$ is an arbitrary real number, and I need to show that the function does not have two distinct roots in the interval $[0,1]$.

I know I am supposed to use the Mean Value Theorem, $f'(c)=\frac {f(b)-f(a)}{b-a}$, but I am confused about how to apply it here.

Thanks in advance.


There are 3 best solutions below


You can simply use that $x^3$ and $x^2$ are both strictly increasing for positive $x$. (What does this tell you about $f(x)$?)
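One way to finish this hint: for any $0 \le a < b \le 1$,
$$f(b)-f(a) = (b^3-a^3) + \tfrac 32 (b^2-a^2) > 0,$$
since $0 \le a < b$ makes both differences strictly positive. So $f$ is strictly increasing on $[0,1]$, and a strictly increasing function can have at most one root there, whatever $\lambda$ is.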


If you want to follow the route of Rolle's Theorem (or Mean Value Theorem), suppose you had two roots $a,b\in [0,1]$. Put those into your equation. Can there be a $c\in (0,1)$ for which that holds?
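Spelling this out with the formula from the question: if $f(a)=f(b)=0$ with $0 \le a < b \le 1$, the Mean Value Theorem gives some $c \in (a,b)$ with
$$f'(c)=\frac {f(b)-f(a)}{b-a}=\frac{0-0}{b-a}=0,$$
and since $c \in (a,b) \subseteq (0,1)$, the question becomes whether $f'$ can vanish anywhere in $(0,1)$.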


Suppose for contradiction that $f(x)= x^3+\frac 32 x^2+\lambda$ has two distinct roots in $[0,1]$.

Then there exist $a,b \in [0,1]$ with $a<b$ such that $f(a)=f(b)=0$.

Then by Rolle's theorem, there is some $x \in (a,b)$ with $f^{\prime}(x)=0$.

But $f^{\prime}(x)=3x^2+3x$.

So $0=3x(x+1)$.

Since $x=0$ and $x=-1$ are the only solutions, and neither lies in $(a,b) \subseteq (0,1)$ (any such $x$ satisfies $x > a \ge 0$ and $x < b \le 1$), this contradicts Rolle's theorem. Hence $f$ cannot have two distinct roots in $[0,1]$.
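As a concrete illustration (the value $\lambda=-1$ is my own choice, not from the question): with $\lambda=-1$ we have $f(0)=-1<0$ and $f(1)=\frac 32>0$, so the Intermediate Value Theorem gives a root in $(0,1)$, and the argument above shows it is the only one. Note the claim only rules out a second root in $[0,1]$; it does not say a root exists for every $\lambda$.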