I'm trying to understand a line in a proof.
We have a positive function $f$ whose derivative satisfies $f'(u) \to 0$ as $u \to \infty$.
We want to prove $\lim_{t \to \infty} \frac{f(t)}{t} = 0$.
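As a sanity check of my own (not from the book): take $f(u) = \sqrt{u}$ on $(1, \infty)$; then
$$f'(u) = \frac{1}{2\sqrt{u}} \to 0 \quad\text{and}\quad \frac{f(t)}{t} = \frac{1}{\sqrt{t}} \to 0,$$
so the claim at least holds in this example.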
The proof says that as $t \to \infty$, $t^{-1} f(t) \sim t^{-1} \int_{z_0}^{t} f'(u)\,du$, where $a(t) \sim b(t)$ means $\frac{a(t)}{b(t)} \to 1$, and where $f$ is absolutely continuous on $(z_0, \infty)$ with density $f'$.
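Unpacking the absolute continuity (my own reading, not a quote from the book): for $t > z_0$ we have $f(t) = f(z_0) + \int_{z_0}^{t} f'(u)\,du$, so
$$\frac{f(t)}{t} = \frac{f(z_0)}{t} + \frac{1}{t}\int_{z_0}^{t} f'(u)\,du,$$
and since $f(z_0)/t \to 0$, the limit of $f(t)/t$ is the same as that of the Cesàro average of $f'$.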
Then the proof says that since the integrand tends to zero, so does the Cesàro average. Therefore
$t + x f(t) = t\left(1 + x\frac{f(t)}{t}\right) \sim t$, which is the line I do not understand. Could someone explain it?
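Here is as far as I can get on my own (I am not sure this is the intended reading): if $\frac{f(t)}{t} \to 0$, then for each fixed $x$,
$$\frac{t + x f(t)}{t} = 1 + x\,\frac{f(t)}{t} \longrightarrow 1 \qquad (t \to \infty),$$
which is exactly the definition of $t + x f(t) \sim t$. Is that really all the line asserts, with $x$ held fixed as $t \to \infty$?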
For added context, the relevant passage is from the reference below.
Resnick, Sidney I., Extreme Values, Regular Variation, and Point Processes, Applied Probability, Vol. 4, Springer-Verlag, New York, 1987. ZBL0633.60001.

