Let $\left\{X_i\right\}$ be a sequence of i.i.d. random variables and $S_n:=\sum_{i=1}^n X_i$. Then, for every $x\in\mathbb{R}$, we have $$ P(S_n\ge nx)\le e^{-n\sup_{\theta\ge 0}\left\{\theta x-\log M(\theta)\right\}}, $$ where $M(\theta):=E(e^{\theta X_1})$ is the moment generating function of $X_1$.
This is called Chernoff's bound.
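(As a numerical sanity check, not part of the proof: here is a small sketch for the assumed example $X_i\sim\operatorname{Bernoulli}(p)$, where $M(\theta)=1-p+pe^{\theta}$. The grid and the values of $n$, $x$, $p$ are arbitrary choices for illustration.)

```python
import math
import random

# Sanity check of the Chernoff bound for X_i ~ Bernoulli(p) (assumed example),
# where M(theta) = 1 - p + p * e^theta.

def chernoff_bound(n, x, p):
    """exp(-n * sup_{theta >= 0} (theta*x - log M(theta))), via a theta-grid."""
    thetas = [k * 0.001 for k in range(10001)]  # grid on [0, 10]
    rate = max(t * x - math.log(1 - p + p * math.exp(t)) for t in thetas)
    return math.exp(-n * rate)

def mc_probability(n, x, p, trials=20000, seed=0):
    """Monte Carlo estimate of P(S_n >= n*x)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(1 for _ in range(n) if rng.random() < p)
        if s >= n * x:
            hits += 1
    return hits / trials

n, x, p = 50, 0.7, 0.5
bound = chernoff_bound(n, x, p)
est = mc_probability(n, x, p)
print(bound, est)  # the bound dominates the empirical frequency
```

For Bernoulli variables the supremum is attained in closed form and equals the relative entropy $x\log\frac{x}{p}+(1-x)\log\frac{1-x}{1-p}$, so the grid search can be checked against that value.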
I started to prove this as follows:
First of all, for $\theta\ge 0$, we have $$ \left\{S_n\ge nx\right\}\subset\left\{e^{\theta S_n}\ge e^{\theta nx}\right\}. $$
(For $\theta >0$, we even have equality.)
Hence, for $\theta\ge 0$, $$ P(S_n\ge nx)\le P(e^{\theta S_n}\ge e^{\theta xn}). $$ Since $e^{\theta S_n}$ is a non-negative random variable, we can apply Markov's inequality, giving $$ P(e^{\theta S_n}\ge e^{\theta xn})\le\frac{E(e^{\theta S_n})}{e^{\theta xn}}. $$ This holds for all $\theta\ge 0$. Hence, $$ P(S_n\ge nx)\le\inf_{\theta\ge 0}\frac{E(e^{\theta S_n})}{e^{\theta xn}}. $$ Now, I guess that $$ \inf_{\theta\ge 0}\frac{E(e^{\theta S_n})}{e^{\theta xn}}=e^{-n\sup_{\theta\ge 0}\left\{\theta x-\log M(\theta)\right\}}. $$
Unfortunately, I have not been able to verify this yet. Can you tell me how to show that the infimum equals this expression?
The $X_i$'s being independent (i) and identically distributed (ii), you get $$ \mathbb{E}\big[e^{\theta S_n}\big] = \mathbb{E}\big[e^{\theta \sum_{i=1}^n X_i}\big] = \mathbb{E}\big[\prod_{i=1}^n e^{\theta X_i}\big] \operatorname*{=}_{\rm(i)} \prod_{i=1}^n \mathbb{E}\big[e^{\theta X_i}\big] \operatorname*{=}_{\rm(ii)} \mathbb{E}\big[e^{\theta X_1}\big]^n = M(\theta)^n $$ so you get $$ \inf_{\theta\geq 0}\frac{\mathbb{E}\big[e^{\theta S_n}\big]}{e^{\theta xn}} = \inf_{\theta\ge 0}\frac{e^{n\ln M(\theta)}}{e^{\theta xn}} = \inf_{\theta\ge 0} e^{-n(\theta x - \ln M(\theta))} = e^{-n \sup_{\theta\geq 0} (\theta x - \ln M(\theta))} $$ where the last step uses the fact that $\exp$ is increasing and continuous, so $\inf_\theta e^{f(\theta)} = e^{\inf_\theta f(\theta)}$, together with $\inf_\theta \{-f(\theta)\} = -\sup_\theta f(\theta)$.
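(You can also confirm the identity numerically. A minimal sketch, assuming $X_1\sim\operatorname{Exponential}(1)$ so that $M(\theta)=\frac{1}{1-\theta}$ for $\theta<1$; the grid and the values of $n$, $x$ are arbitrary choices.)

```python
import math

# Check of the identity
#   inf_{theta>=0} M(theta)^n / e^{theta*x*n}
#     = exp(-n * sup_{theta>=0} (theta*x - log M(theta)))
# for the assumed example X_1 ~ Exponential(1), M(theta) = 1/(1-theta), theta < 1.

n, x = 10, 2.0
thetas = [k * 0.0001 for k in range(1, 9999)]  # grid on (0, 1)

def M(t):
    return 1.0 / (1.0 - t)

lhs = min(M(t) ** n / math.exp(t * x * n) for t in thetas)
rate = max(t * x - math.log(M(t)) for t in thetas)
rhs = math.exp(-n * rate)
print(lhs, rhs)  # the two sides agree
```

Here the supremum is attained at $\theta^\ast=1/2$ with value $\theta^\ast x - \ln M(\theta^\ast) = 1-\ln 2$, which the grid search recovers.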
Remark: this is not the most general statement of the Chernoff bound, which also holds for independent random variables that are not identically distributed: one simply keeps the product $\prod_{i=1}^n \mathbb{E}\big[e^{\theta X_i}\big]$ instead of collapsing it to $M(\theta)^n$.
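(A sketch of that more general version, with assumed heterogeneous parameters: for independent $X_i\sim\operatorname{Bernoulli}(p_i)$ one gets $P(S_n\ge a)\le\inf_{\theta\ge 0} e^{-\theta a}\prod_{i=1}^n M_i(\theta)$ with $M_i(\theta)=1-p_i+p_i e^{\theta}$, checked here against a Monte Carlo estimate.)

```python
import math
import random

# Non-identically-distributed Chernoff bound (assumed example):
# for independent X_i ~ Bernoulli(p_i),
#   P(S_n >= a) <= inf_{theta>=0} e^{-theta*a} * prod_i M_i(theta),
# where M_i(theta) = 1 - p_i + p_i * e^theta.

ps = [0.2, 0.5, 0.3, 0.6, 0.4] * 6    # 30 heterogeneous success probabilities
a = 18                                # threshold above the mean E[S_n] = 12

def bound(theta):
    log_prod = sum(math.log(1 - p + p * math.exp(theta)) for p in ps)
    return math.exp(-theta * a + log_prod)

best = min(bound(k * 0.001) for k in range(5001))  # theta grid on [0, 5]

rng = random.Random(1)
trials = 20000
hits = sum(1 for _ in range(trials)
           if sum(1 for p in ps if rng.random() < p) >= a)
print(best, hits / trials)  # the bound dominates the empirical frequency
```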