Finding the intervals where $h(x)=\frac{f(x)}{x}$ is increasing or decreasing, given that $f$ is a real-valued function defined on $[0,\infty)$


If $f$ is a real-valued function defined on $[0,\infty)$ such that $f(0)=0$ and $f''(x)\gt 0$ for all $x$, find the intervals on which the function $h(x)=\frac{f(x)}{x}$ is increasing or decreasing.
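To get intuition for what to expect, here is a quick numerical sanity check with one hypothetical concrete choice of $f$ satisfying the hypotheses, namely $f(x)=e^x-1$ (so $f(0)=0$ and $f''(x)=e^x>0$); it suggests $h$ should come out increasing on $(0,\infty)$:

```python
import math

# Hypothetical example satisfying the hypotheses:
# f(x) = e^x - 1, so f(0) = 0 and f''(x) = e^x > 0 for all x.
def f(x):
    return math.exp(x) - 1.0

def h(x):
    return f(x) / x  # only defined for x > 0

# Evaluate h on a grid in (0, 10] and check the values are strictly increasing.
xs = [0.01 * k for k in range(1, 1001)]
vals = [h(x) for x in xs]
print(all(a < b for a, b in zip(vals, vals[1:])))  # → True
```

Of course, one example proves nothing about the general case; it only indicates what answer the general argument should produce.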

My attempt: $$h(x)=\frac{f(x)}{x}$$ $$h'(x)=\frac{xf'(x)-f(x)}{x^2}$$ So $h'(x)=0$ when $xf'(x)=f(x)$, and $h'(x)$ does not exist at $x=0$. Let $$g(x)=xf'(x)-f(x),$$ so that $h'(x)=\frac{g(x)}{x^2}$. Since $g'(x)=xf''(x)$, $$h''(x)=\frac{(xf''(x))x^2-2xg(x)}{x^4}=\frac{x^2f''(x)-2g(x)}{x^3}$$ I don't know how to proceed further.
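One standard way to continue (sketched here as a hint, under the same hypotheses $f(0)=0$ and $f''>0$) is to study the sign of $g$ directly instead of computing $h''$:

```latex
g(0) = 0 \cdot f'(0) - f(0) = 0, \qquad
g'(x) = f'(x) + x f''(x) - f'(x) = x f''(x) > 0 \quad (x > 0),
```

so $g$ is strictly increasing on $[0,\infty)$, and therefore $g(x) > g(0) = 0$ for every $x > 0$. Since $h'(x) = g(x)/x^2$, this would give $h'(x) > 0$ on $(0,\infty)$, i.e. $h$ increasing there, with no interval of decrease.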