This is exercise 2.2.10 from the book High-Dimensional Probability, by Vershynin. Let $X_1,\ldots,X_n$ be non-negative independent random variables with densities bounded by $1$. Show that the MGF of $X_i$ satisfies $$ E \exp(-tX_i)\leq \frac{1}{t} $$ for all $t > 0$. After that, deduce that for any $\varepsilon >0$, one has $$ P\left( \sum^n_{i=1}X_i \leq \varepsilon n \right)\leq (e\varepsilon)^n $$
Some help would be much appreciated. I was not able to prove even the first inequality. This question appears in the section dealing with Hoeffding's inequality, so that technique is probably used somehow.
The first inequality comes from the fact that the densities $p_i$ are bounded by $1$ and the $X_i$ are non-negative:
$$ \mathbb{E}[e^{-t X_i}] = \int_0^\infty e^{-tx} p_i(x)dx \le \int_0^\infty e^{-tx}dx = \frac 1t. $$
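As a quick sanity check of this bound, take $X_i$ uniform on $[0,1]$ (a non-negative variable whose density is bounded by $1$); then $\mathbb{E}[e^{-tX_i}] = (1-e^{-t})/t$ in closed form, which we can compare against $1/t$:

```python
import math

def mgf_neg_uniform(t):
    # E[exp(-t X)] for X ~ Uniform(0, 1): closed form (1 - e^{-t}) / t
    return (1.0 - math.exp(-t)) / t

# The bound E[exp(-t X)] <= 1/t holds for every t > 0
for t in [0.5, 1.0, 2.0, 10.0]:
    assert mgf_neg_uniform(t) <= 1.0 / t
```

The bound is tight as $t \to \infty$, since $(1-e^{-t})/t \to 1/t$ in ratio.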
For the second inequality, apply the Chernoff method, using the independence of the $X_i$: for any $t > 0$,
\begin{align*} P\left( \sum_{i=1}^n X_i \le \varepsilon n \right) &= P\left( \sum_{i=1}^n(-t X_i) \ge -\varepsilon nt \right) \\ &= P\left( \exp \left(\sum_{i=1}^n(-t X_i)\right) \ge e^{-\varepsilon nt} \right) \\ &\le e^{\varepsilon nt} \mathbb{E}\left[\exp \left(\sum_{i=1}^n(-t X_i)\right)\right] \\ &= e^{\varepsilon nt}\prod_{i=1}^n \mathbb{E}[e^{-tX_i}] \\ &\le e^{\varepsilon n t} \frac{1}{t^n} \\ &= \left( \frac{e^{\varepsilon t}}{t} \right)^n, \end{align*}
where the first inequality is Markov's inequality applied to the non-negative variable $\exp\left(\sum_{i=1}^n(-tX_i)\right)$.
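We can check this family of bounds numerically. Assuming the $X_i$ are i.i.d. Uniform$(0,1)$, the exact probability for $\varepsilon n \le 1$ is the Irwin–Hall CDF value $(\varepsilon n)^n/n!$, which should sit below $(e^{\varepsilon t}/t)^n$ for every $t > 0$:

```python
import math

n, eps = 5, 0.1
# Exact P(X_1 + ... + X_n <= s) for iid Uniform(0,1) and s <= 1
# (Irwin-Hall CDF): s^n / n!
exact = (eps * n) ** n / math.factorial(n)

# The Chernoff bound (e^{eps t} / t)^n holds for every t > 0
for t in [1.0, 5.0, 10.0, 20.0]:
    chernoff = (math.exp(eps * t) / t) ** n
    assert exact <= chernoff
```

Note that small $t$ gives a very loose bound (at $t=1$ the bound exceeds $1$), which is why the next step optimizes over $t$.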
Now choose $t$ to minimize $\frac{e^{\varepsilon t}}{t}$: setting the derivative $\frac{d}{dt}\frac{e^{\varepsilon t}}{t} = \frac{e^{\varepsilon t}(\varepsilon t - 1)}{t^2}$ to zero gives $t^* := \frac 1{\varepsilon}$, so we have
$$ P\left( \sum_{i=1}^n X_i \le \varepsilon n \right) \le \left( \frac{e^{\varepsilon t^*}}{t^*} \right)^n = (e \varepsilon)^n.$$
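A final numerical check of the optimization step (with an illustrative choice $\varepsilon = 0.1$, $n = 5$): the bound at $t^* = 1/\varepsilon$ equals $(e\varepsilon)^n$, and no nearby $t$ does better:

```python
import math

eps, n = 0.1, 5
f = lambda t: math.exp(eps * t) / t   # the quantity being minimized

t_star = 1.0 / eps
opt = f(t_star) ** n
# At t* = 1/eps the bound collapses to (e * eps)^n
assert abs(opt - (math.e * eps) ** n) < 1e-12

# t* minimizes f over a grid of other values of t
for t in [0.5, 1.0, 5.0, 9.0, 11.0, 50.0]:
    assert f(t_star) <= f(t)
```

Note the bound $(e\varepsilon)^n$ is only informative when $\varepsilon < 1/e$; otherwise it exceeds $1$ and is trivially true.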