How can I evaluate the limit of $\theta$ in the expression $f(x+h)-f(x)=h f'(x+\theta h)$?


I'm working on a problem that asks me to calculate $\lim_{h\rightarrow 0} \theta$, where $\theta$ comes from the mean value theorem $f(x+h)-f(x)=hf'(x+\theta h)$ and $f$ is continuously differentiable. The case where $f$ is twice continuously differentiable is easy: comparing coefficients with the Taylor expansion of $f(x+h)$ at $x$ gives $\lim\limits_{h\to 0} \theta =\frac{1}{2}$ (provided $f''(x) \neq 0$). In the merely $C^1$ case, however, I have no idea whether to "pull" $\theta$ out of $f$ or to construct a counterexample. So my question is: does the conclusion still hold when $f$ is only once continuously differentiable, and how can I prove or disprove it?
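As a quick numerical sanity check of the $C^2$ case (not part of the proof, just an illustration), take $f(x)=e^x$ at $x=0$. Then $e^h - 1 = h\,e^{\theta h}$ can be solved explicitly for $\theta(h)$, and it tends to $\frac12$ as $h \to 0$:

```python
import math

def theta(h):
    # MVT for f(x) = e^x at x = 0:  e^h - 1 = h * exp(theta * h)
    # =>  theta(h) = ln((e^h - 1) / h) / h
    return math.log((math.exp(h) - 1.0) / h) / h

for h in [0.1, 0.01, 0.001]:
    print(h, theta(h))  # approaches 0.5 as h shrinks
```

The Taylor expansion $e^h - 1 = h + \frac{h^2}{2} + O(h^3)$ shows $\theta(h) = \frac12 + O(h)$, consistent with the printed values.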

On BEST ANSWER

The conclusion is not always true if $f'$ is continuous at $x$ but not differentiable at $x$.

For a counterexample, consider $f(x) = 2x \sqrt{|x|}$ at $x=0$.

The derivative is $f'(x) = 3\sqrt{|x|}$. For every nonzero real $h$, the value $\theta(h) \in (0,1)$ must satisfy

$$ f(0+h)-f(0) = h f'(0+h \theta(h)) $$

$$ 2h\sqrt{|h|} = 3h \sqrt{|h\,\theta(h)|} $$

Dividing both sides by $3h\sqrt{|h|}$ (valid since $h \neq 0$) gives $\sqrt{\theta(h)} = \frac{2}{3}$, hence

$$ \theta(h) = \frac 49 $$

$$ \lim_{h \to 0} \theta(h) = \frac 49 \neq \frac 12 $$
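The computation above can be verified numerically (a sanity check only, not part of the argument): solving $f(h)-f(0) = h f'(\theta h)$ for $\theta$ with $f(x) = 2x\sqrt{|x|}$ and $f'(u) = 3\sqrt{|u|}$ gives a closed form, and it evaluates to $\frac49$ for every tested $h$, positive or negative:

```python
def f(x):
    # The counterexample: f(x) = 2 x sqrt(|x|), which is C^1 but not C^2 at 0
    return 2.0 * x * abs(x) ** 0.5

def theta(h):
    # MVT at x = 0:  f(h) - f(0) = h * f'(theta * h), with f'(u) = 3 sqrt(|u|).
    # Squaring (f(h)/h) = 3 sqrt(|h| * theta) gives
    #   theta = (f(h)/h)^2 / (9 |h|)
    return (f(h) / h) ** 2 / (9.0 * abs(h))

for h in [0.1, -0.01, 1e-6]:
    print(h, theta(h))  # always 4/9, up to floating-point rounding
```

Note that $\theta(h)$ is exactly constant here, so the limit $\frac49 \neq \frac12$ is immediate.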

(Also, as the question you linked in the comments implies, the conclusion need not hold even for twice-differentiable $f$ when $f''(x) = 0$.)