Exercise :
Let $f:(0,\infty) \to \mathbb R$ be a $C^2$ function such that $\lim_{x \to \infty} xf(x) = 2$ and $\lim_{x \to \infty} xf''(x) = 0$. For $x>0$, prove that there exists $\xi \in (x,x+1)$ such that $$xf'(x) = \frac{x}{x+1}(x+1)f(x+1) - xf(x) - \frac{1}{2}xf''(\xi),$$ and then compute the limit $\lim_{x \to \infty} xf'(x)$.
Question/Request : I need some help, hints, or tips on how to prove the displayed formula. The final limit is straightforward once the formula is established, so no help is needed there.
I don't know why you would write the identity in this peculiar way (presumably so that the given limits can be applied directly as $x\to\infty$), but this is just Taylor's theorem on the interval $[x,x+1]$. Since $f\in C^2$, the theorem guarantees the existence of $\xi\in(x,x+1)$ such that $f(x+1)=f(x)+f'(x)+\frac12f''(\xi)$. Multiply by $x$ and you're done.
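To spell out the algebra behind the hint (a sketch only, using nothing beyond Taylor's theorem with Lagrange remainder and the trivial rewriting $xf(x+1)=\frac{x}{x+1}(x+1)f(x+1)$):

```latex
\begin{align*}
f(x+1) &= f(x) + f'(x) + \tfrac12 f''(\xi)
  && \text{Taylor on } [x,x+1],\ \xi\in(x,x+1) \\
x f'(x) &= x f(x+1) - x f(x) - \tfrac12\, x f''(\xi)
  && \text{multiply by } x \text{ and solve for } xf'(x) \\
        &= \frac{x}{x+1}\,(x+1) f(x+1) - x f(x) - \tfrac12\, x f''(\xi)
  && \text{rewrite } xf(x+1)
\end{align*}
```

Written this way, each term on the right matches one of the given hypotheses: $\frac{x}{x+1}\to 1$, $(x+1)f(x+1)\to 2$, $xf(x)\to 2$, and $xf''(\xi)=\frac{x}{\xi}\,\xi f''(\xi)\to 1\cdot 0=0$ since $\xi\in(x,x+1)$.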