Question: If $f$ and $g$ are two positive, increasing functions such that $f(t)\leq g(t)$ for all $t>0$ and $X$ is a positive random variable, then is it true that for all $t>0$,
$$\frac{\mathbb{E}[f(X)]}{f(t)}\leq \frac{\mathbb{E}[g(X)]}{g(t)}$$
under some additional assumptions (perhaps differentiability of $f,g$, perhaps something more)? I would like to know which assumptions make this work, since, as pointed out in the comment section, it does not hold in general.
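A quick numerical illustration of how the inequality can fail. The particular choice of $f$, $g$, and $X$ below is my own (the comments may use a different one): take $f(t)=t$ and $g(t)=t+1$, both positive and increasing with $f\leq g$ on $(0,\infty)$, and let $X$ be the constant random variable $X=2$.

```python
def f(t):
    return t          # f(t) = t

def g(t):
    return t + 1.0    # g(t) = t + 1, so f(t) <= g(t) for all t > 0

X = 2.0  # degenerate random variable: P(X = 2) = 1, so E[h(X)] = h(2)
t = 1.0

lhs = f(X) / f(t)  # E[f(X)] / f(t) = 2 / 1 = 2.0
rhs = g(X) / g(t)  # E[g(X)] / g(t) = 3 / 2 = 1.5

print(lhs, rhs, lhs <= rhs)  # 2.0 1.5 False -- the inequality fails
```

So even deterministic $X$ and affine $f,g$ already break the claim, which is why extra assumptions are needed.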
My Work: I hypothesized this after working on a problem from the source below. I proved that the inequality holds when $f(t)=t^q$ for $q\in\mathbb{N}$ and $g(t)=e^{\lambda t}$ for $\lambda>0$, which amounts to comparing moment bounds with Cramér-Chernoff tail bounds. It also holds trivially (in fact with equality) for $f(t)=t$ versus $g(t)=\alpha t$ with $\alpha\geq 1$.
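Here is a numerical sanity check of the optimized version of that comparison for one concrete distribution (my own setup, not from the book): for $X\sim\mathrm{Exp}(1)$ we have the closed forms $\mathbb{E}[X^q]=q!$ and $\mathbb{E}[e^{\lambda X}]=1/(1-\lambda)$ for $\lambda\in(0,1)$, so both bounds can be evaluated exactly.

```python
import math

t = 5.0

# Best moment bound at level t: min over integers q >= 1 of E[X^q] / t^q = q! / t^q.
moment_bound = min(math.factorial(q) / t**q for q in range(1, 50))

# Best Chernoff bound: inf over lam in (0, 1) of e^{-lam * t} / (1 - lam),
# attained at lam = 1 - 1/t for t > 1, with value t * e^{1 - t}.
chernoff_bound = t * math.exp(1.0 - t)

print(moment_bound, chernoff_bound)    # ~0.0384 vs ~0.0916
print(moment_bound <= chernoff_bound)  # True: the moment bound is tighter
```

This agrees with the general statement in the source that the optimized moment bound is never worse than the optimized Cramér-Chernoff bound.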
Source: In "Concentration Inequalities" by Boucheron, Lugosi, and Massart, there is a problem (2.5) on proving that moment bounds are tighter than Cramér-Chernoff bounds. I was wondering if I could weasel out an argument comparing $g(t)=e^{\lambda t}$ with $t^q$ via the Taylor expansion of the exponential, instead of the argument in the reference.
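For reference, here is what the Taylor expansion gives pointwise (my own sketch; note the comparison only holds up to a constant, not literally $e^{\lambda t}\geq t^q$): since every term of the exponential series is nonnegative for $x>0$,
$$e^{\lambda x}=\sum_{k=0}^{\infty}\frac{(\lambda x)^k}{k!}\;\geq\;\frac{(\lambda x)^q}{q!},\qquad\text{hence}\qquad x^q\;\leq\;\frac{q!}{\lambda^q}\,e^{\lambda x}.$$
Taking expectations and dividing by $t^q$,
$$\frac{\mathbb{E}[X^q]}{t^q}\;\leq\;\frac{q!}{(\lambda t)^q}\,\mathbb{E}[e^{\lambda X}],$$
so the Taylor route reaches the exponential moment only with an extra factor $\frac{q!}{(\lambda t)^q}$. Turning this into $\frac{\mathbb{E}[f(X)]}{f(t)}\leq\frac{\mathbb{E}[g(X)]}{g(t)}$ would additionally require $q!\,e^{\lambda t}\leq(\lambda t)^q$, which fails because $\frac{(\lambda t)^q}{q!}$ is a single term of the series for $e^{\lambda t}$, so this naive version of the argument does not close the gap by itself.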