Existence of an asymptote for $g(x)=\frac{f(x)f'(x)+f(1)f'(1)}{f'(x)+f'(1)}-f\left(\frac{xf'(x)+f'(1)}{f'(x)+f'(1)}\right)$


Working with Slater's inequality (a companion to Jensen's inequality), I came across the following statement:

Let $f(x)$ be continuous, $n$ times differentiable, convex, and non-constant on $(0,\infty)$, increasing on $(1,\infty)$, and with unbounded derivative. Define $$g(x)=\frac{f(x)f'(x)+f(1)f'(1)}{f'(x)+f'(1)}-f\left(\frac{xf'(x)+f'(1)}{f'(x)+f'(1)}\right).$$ Then $g(x)$ is strictly increasing on $(1,\infty)$.

Claim: $\lim_{x\to\infty}\frac{g(x)}{x}=c$ for some constant $c$.

I do not have a way to approach the general case, but let me try some examples:

For my first example I take the exponential, see here. The function $g$ can be decreasing or increasing on such an interval; see for example $f(x)=x^x$. As a particular case, for $f(x)=x+\frac{1}{x}$ the limit is a constant. So far I have tried only elementary functions and their compositions, but it would be curious if the claim held only for them.

My question:

Do you have a counter-example (which would be ideal, because I have some doubts about this statement) or a proof (which I think is not easy)?

Thanks in advance!