During my Statistics course, we were asked the following question:
Let $ X_1, \ldots , X_n $ be $n$ i.i.d. observations with $ X_i \sim \mathcal{N} (0,\sigma^2) $. Use the Chernoff bound, i.e.
$$ \Pr( X \geq t ) \leq \frac{E(e^{\lambda X})}{e^{\lambda t}} $$
And the fact that the Moment Generating Function of $X_i$ is
$$ M_{X_i}(\lambda) = E(e^{\lambda X_i}) = e^{\frac{1}{2} \sigma^2 \lambda^2} $$
to prove that, for all $ t > 0$
$$ \Pr\left( \frac{1}{n} \sum_i^n X_i \geq t \right) \leq e^{-n \frac{t^2}{2\sigma^2} } .$$
Using the MGF of the mean, I have:
$$ \Pr\left( \frac{1}{n} \sum_{i=1}^n X_i \geq t \right) \leq \frac{e^{\frac{\sigma^2}{2n} \lambda^2 }}{e^{\lambda t}} $$
(if I haven't miscalculated something).
But I can't get any further...
Note that $\bar{X}=n^{-1}\sum_{i=1}^n X_i$ is normally distributed with $E\bar{X}=0$, $\text{Var}(\bar{X})=\sigma^2/n$, and moment generating function $Ee^{\lambda \bar{X}}=\exp\left(\frac{\sigma^2}{2n}\lambda^2\right)$. In particular, for $\lambda>0$ the Chernoff bound gives us
$$ P(\bar{X}\geq t)\le e^{-\lambda t}Ee^{\lambda \bar{X}}=\exp\left(\frac{\sigma^2}{2n}\lambda^2-\lambda t\right);\quad (\lambda>0)\tag{0} $$
Now minimize the right-hand side of $(0)$ over $\lambda>0$: the exponent $\frac{\sigma^2}{2n}\lambda^2-\lambda t$ is a quadratic in $\lambda$, minimized at $\lambda=tn/\sigma^2$. Plugging this in gives
$$ P(\bar{X}\geq t)\leq \exp\left(-\frac{t^2 n}{2\sigma^2}\right). $$
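As a quick sanity check of the final inequality, here is a small Monte Carlo sketch: it estimates $P(\bar{X}\geq t)$ by simulation and compares it against $\exp(-nt^2/(2\sigma^2))$. The values of `n`, `sigma`, `t`, and the trial count are arbitrary choices for illustration.

```python
import math
import random

# Arbitrary illustrative parameters (not from the problem statement).
random.seed(0)
n, sigma, t = 20, 1.0, 0.5
trials = 100_000

# Estimate P(mean of n i.i.d. N(0, sigma^2) samples >= t) by simulation.
hits = 0
for _ in range(trials):
    xbar = sum(random.gauss(0.0, sigma) for _ in range(n)) / n
    if xbar >= t:
        hits += 1

empirical = hits / trials
bound = math.exp(-n * t**2 / (2 * sigma**2))
print(empirical, bound)
# The empirical frequency should stay below the Chernoff bound.
assert empirical <= bound
```

With these parameters the bound is $e^{-2.5}\approx 0.082$, while the true tail probability is much smaller (about $\Phi(-t\sqrt{n}/\sigma)\approx 0.013$), so the bound is valid but not tight.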