If $f(0)=0$ and $f''(x)$ exists for all $x>0$, show that $f'(x)-\frac{f(x)}{x}=\frac{1}{2}xf''(\zeta)$ for some $0<\zeta<x$


If $f(0)=0$ and $f''(x)$ exists for all $x>0$, then show that $$ f'(x)-\frac{f(x)}{x} = \frac{1}{2} xf''(\zeta), \quad 0<\zeta <x $$ Also deduce that if $f''(x)$ is positive for positive values of $x$, then $f(x)/x$ strictly increases as $x$ increases.

For the first part I defined $\phi(x) = xf'(x) - f(x)$. Applying the Lagrange mean value theorem to it, I got $f'(x)-f(x)/x = xf''(\zeta)$, so the factor of $1/2$ is missing. Can someone point out the mistake? Also, kindly help with the second part of the question, which asks to show that $f(x)/x$ is strictly increasing.

Use Taylor's theorem to expand $f(t)$ about the point $x$ to first order: $$f(t) = f(x) + (t-x)f'(x) + r_1(t),$$ where $r_1(t)$ is the remainder. By the Lagrange form of the remainder, $$r_1(t)=\frac{1}{2}(t-x)^2f''(\zeta)$$ for some $\zeta$ strictly between $t$ and $x$. Setting $t=0$, using $f(0)=0$, and dividing by $x$ gives the required identity.
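Spelled out, the substitution $t=0$ (together with the hypothesis $f(0)=0$) goes as follows:

$$0 = f(0) = f(x) + (0-x)f'(x) + \frac{1}{2}(0-x)^2 f''(\zeta) = f(x) - x f'(x) + \frac{1}{2}x^2 f''(\zeta), \qquad 0 < \zeta < x.$$

Rearranging and dividing through by $x > 0$ yields exactly $$f'(x) - \frac{f(x)}{x} = \frac{1}{2}\, x f''(\zeta).$$ This also explains the missing $1/2$ in the mean-value-theorem attempt: the MVT applied to $\phi(x)=xf'(x)-f(x)$ gives $\phi(x)=x\,\phi'(\xi)=x\,\xi f''(\xi)$, i.e. $f'(x)-f(x)/x=\xi f''(\xi)$, and $\xi$ cannot simply be replaced by $x$.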

For the second part, divide both sides of the identity by $x$. The left-hand side becomes $$\frac{xf'(x) - f(x)}{x^2} = \frac{d}{dx}\left(\frac{f(x)}{x}\right),$$ while the right-hand side is $\frac{1}{2}f''(\zeta)$, which is positive by assumption. A function with positive derivative on $(0,\infty)$ is strictly increasing there, so $f(x)/x$ strictly increases as $x$ increases.
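As a quick numerical sanity check, not part of the proof, one can verify both the identity and the monotonicity for the hypothetical choice $f(x)=x^3$ (so $f''(x)=6x>0$ for $x>0$):

```python
import math

# Illustrative example only: f(x) = x**3, a function with f(0) = 0
# and f''(x) = 6x > 0 for x > 0.
def f(x):   return x**3
def fp(x):  return 3 * x**2   # f'
def fpp(x): return 6 * x      # f''

x = 2.0
lhs = fp(x) - f(x) / x        # f'(x) - f(x)/x

# Solve (1/2) * x * f''(zeta) = lhs for zeta; here f''(zeta) = 6*zeta,
# so zeta = lhs / (3*x), which should land strictly inside (0, x).
zeta = lhs / (3 * x)
assert 0 < zeta < x
assert math.isclose(0.5 * x * fpp(zeta), lhs)

# f(x)/x = x**2 is strictly increasing on x > 0, as the theorem predicts.
xs = [0.1 * k for k in range(1, 50)]
ratios = [f(t) / t for t in xs]
assert all(a < b for a, b in zip(ratios, ratios[1:]))
print("identity and monotonicity check passed")
```

For this choice of $f$ one gets $\zeta = 2x/3$, which indeed lies strictly between $0$ and $x$.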