Let $f : \mathbb{R} \to \mathbb{R}$ be continuously differentiable with $d = \inf_{x \in \mathbb{R}} f'(x) > 0$. Prove that $f(a) = 0$ for some $a \in \mathbb{R}$


Let $f : \mathbb{R} \to \mathbb{R}$ be a continuously differentiable function. Suppose $d = \inf_{x \in \mathbb{R}} f'(x)> 0$. Prove that $f(a) = 0$ for some $a \in \mathbb{R}$.

Attempt:

My idea is to prove that $f$ takes both positive and negative values and then to use the intermediate value theorem to prove the existence of $a$.

To the contrary, let us assume that $f(x) > 0$ for all $x \in \mathbb{R}$. Since $d>0$, $f$ is strictly increasing; being also bounded below by $0$, it has a finite limit $$\lim_{x \to -\infty} f(x) = L \geq 0. \hspace{2cm} (1)$$

It is given that the function is continuously differentiable

Fix $x \in \mathbb{R}$ and let $y \to -\infty$. By the mean value theorem there is some $\xi_y \in (y,x)$ with $f'(\xi_y) = \frac{f(x)-f(y)}{x-y}$, and $$\lim_{y \to -\infty} \frac{f(x)-f(y)}{x-y} = 0, \hspace{2cm}(2)$$ because, due to (1), the numerator stays bounded while the denominator tends to $\infty$. Hence $f'$ takes arbitrarily small positive values, which contradicts the assumption that $\inf_{x \in \mathbb{R}} f'(x)> 0$. $\hspace{2cm} (3)$

A similar argument shows that the function cannot take only negative values.

Hence, by the intermediate value theorem, we can conclude that for such a function there exists an $a \in \mathbb{R}$ such that $f(a) = 0$.

Specifically, my doubt is whether I can draw conclusions (2) and (3).
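As a sanity check of the intended conclusion (that $f$ must change sign), here is a quick numerical illustration with a sample function of my own choosing, not from the question: $f(x) = x + 0.5\sin x + 2$, for which $f'(x) = 1 + 0.5\cos x \geq 0.5 = d$. The MVT bound $f(b) \leq f(0) + db$ for $b<0$ gives a bracket $[-f(0)/d,\, 0]$, and bisection locates a root inside it.

```python
import math

# Sample function (my own choice, not from the question) satisfying the
# hypothesis: f'(x) = 1 + 0.5*cos(x) >= 0.5, so f' is bounded below by d > 0.
def f(x):
    return x + 0.5 * math.sin(x) + 2.0

d = 0.5

# MVT bound: f(b) <= f(0) + d*b for b < 0, so f is non-positive by
# b = -f(0)/d = -4, while f(0) = 2 > 0.  This brackets a sign change.
lo, hi = -f(0) / d, 0.0
assert f(lo) <= 0 < f(hi)

# Bisection keeps the sign change inside [lo, hi].
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) <= 0:
        lo = mid
    else:
        hi = mid

root = (lo + hi) / 2
```

After 60 halvings of an interval of length 4, `root` satisfies $f(\mathtt{root}) \approx 0$ to well below floating-point noise.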



On BEST ANSWER

By the MVT, for any $b<a$ there exists a $\xi$ with $b<\xi<a$ such that $$\frac{f(a)-f(b)}{a-b}=f'(\xi).$$ So $$\frac{f(a)-f(b)}{a-b}\geq d>0,$$ and since $a-b$ is positive, $$f(a)-f(b)\geq (a-b)d.$$

Now assume $f(a)>0$. Letting $b \to -\infty$ makes the RHS arbitrarily large, so $f(b)$ must eventually become negative for the inequality to be possible. Thus $f$ changes sign, and so it must have a zero.

If $f(b)<0$, let $a$ grow large and positive; then $f(a)$ must eventually become positive to keep up with the RHS.

So pick any point, say $0$. Either $f(0)=0$, which is what we wanted, or it is positive or negative. If positive, choosing $a=0$ proves that $f$ has a root. If negative, choose $b=0$ and argue as above to conclude that $f$ has a zero.

Note: To come up with this, think about it geometrically: the inequality says the values of $f$ at $a$ and $b$ differ by at least $d(a-b)$. By making this difference large enough, we see that somewhere the two values lie on opposite sides of the $x$-axis.
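The key inequality $f(a)-f(b)\geq (a-b)d$ can be spot-checked numerically. The sample function below is my own choice (not from the answer); it has $f'(x) = 1 + 0.5\cos x \geq 0.5 = d$:

```python
import math
import random

# Sample function (my own choice) with f'(x) = 1 + 0.5*cos(x) >= d = 0.5.
def f(x):
    return x + 0.5 * math.sin(x) + 2.0

d = 0.5

# Spot-check the MVT consequence f(a) - f(b) >= (a - b)*d on random pairs b < a
# (small epsilon absorbs floating-point rounding).
random.seed(0)
for _ in range(10_000):
    a, b = random.uniform(-100, 100), random.uniform(-100, 100)
    if b < a:
        assert f(a) - f(b) >= (a - b) * d - 1e-9
```

For this $f$ the inequality even holds exactly, since $|\sin a - \sin b| \leq |a-b|$.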

On

Without loss of generality, suppose that $f(0) > 0$. Put $\inf_\mathbb{R}f' = d>0$. Define the function $y(x) = f(0) + dx$. Graphically, $y$ is the line passing through $(0, f(0))$ with $f$'s "minimum slope", so intuitively, $f$ should have a zero before $y$ does.

Put $a = f(0)/d>0$ and note that $y(-a) = 0$. We will show that $f$ has a zero in the interval $I=[-a,0]$.

Suppose that $f(x) > 0$ for all $x\in I$. Then, since $f$ is strictly increasing on $I$ and $f(-a) > 0$ by assumption, $|f(-a) - f(0)| = f(0) - f(-a) < f(0)$. Simultaneously, by the mean value theorem, $|f(-a)-f(0)| = f'(c)a$ for some $c$ between $-a$ and $0$. However, $f'(c) \ge d$, so that $|f(-a)-f(0)| \ge da = f(0)$.

Thus the two inequalities $|f(-a) - f(0)| < f(0)$ and $|f(-a) - f(0)| \ge f(0)$ hold at once, which is a contradiction. So our assumption that $f(x) > 0$ for all $x\in I$ must be false, and there exists a zero of $f$ in $I$, as desired.
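The bracketing claim (a zero of $f$ lies in $[-f(0)/d,\, 0]$) can be illustrated numerically. The sample functions below are my own (not from the answer); each satisfies $f' \geq d = 0.5$ everywhere and $f(0) > 0$:

```python
import math

# Hypothetical examples (my own choices) with f'(x) >= d = 0.5 and f(0) > 0.
d = 0.5
samples = [
    lambda x: x + 0.5 * math.sin(x) + 2.0,   # f' = 1 + 0.5*cos(x) >= 0.5
    lambda x: 0.5 * x + math.atan(x) + 1.0,  # f' = 0.5 + 1/(1+x^2) >= 0.5
    lambda x: 2.0 * x + 3.0,                 # f' = 2 >= 0.5
]

for f in samples:
    a = f(0) / d
    # The answer's claim: f(-a) <= 0 < f(0), so f has a zero in [-a, 0].
    assert f(-a) <= 0 < f(0)
```

Note that the argument only needs $d$ to be *a* positive lower bound on $f'$, not the exact infimum, which is why the third example works with $d = 0.5$ even though $\inf f' = 2$.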