Bound for sum of iid Bernoulli distributions


Let $X$ be the sum of $n$ iid Bernoulli random variables with parameter $p$. I'd like to show that $\forall t > 0$, $$ \Pr(X > (1+t) n p) < \left(\frac{e^t}{(1+t)^{1+t}} \right)^{np} .$$

What I have is: fix any $t > 0$, let $X_i$ be the $i$-th Bernoulli variable, and introduce a free parameter $s > 0$ (using the fixed $t$ itself as the exponent would lock in one particular choice, so I keep $s$ free to optimize later): \begin{align*} \Pr(X > (1 + t) np) &= \Pr(e^{s X} > e^{s (1 + t) np}) \\ &\le \frac{\mathbb{E}[e^{s X}]}{e^{s ( 1+ t) np}} & (\text{Markov's inequality})\\ &= \frac{\mathbb{E}[e^{s X_1} \cdots e^{s X_n} ]}{e^{s ( 1+ t) np}} \\ &= \frac{(1 - p + p e^{s})^{n}}{e^{s ( 1+ t) np}} & (X_i\text{ are iid}) ,\end{align*} where the last line uses the moment generating function (mgf) of a Bernoulli variable. The reason I use the mgf is that otherwise I cannot see any way to introduce an exponential on the right-hand side of the bound.

But I can proceed no further. Is there any key step I am missing? I really appreciate any hints. Thank you.

Best answer

This result is called Chernoff's inequality. It actually holds more generally when the Bernoulli random variables are not identically distributed (independence is an important assumption, though). For a proof, see pages 3 and 4 of this set of lecture notes. (I can add the details here if necessary.) Note that in their proof the Bernoulli variables are not identically distributed, but in the i.i.d. case we have $\mu=\mathbb{E}X=np$.
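For completeness, here is a sketch of the remaining steps. Starting from the Markov/mgf bound with a free parameter $s > 0$ (i.e. $\Pr(X > (1+t)np) \le (1 - p + p e^{s})^{n} / e^{s(1+t)np}$), apply the standard inequality $1 + x \le e^{x}$ with $x = p(e^{s} - 1)$: \begin{align*} \Pr(X > (1+t)np) &\le \frac{(1 - p + p e^{s})^{n}}{e^{s(1+t)np}} \\ &\le \frac{e^{np(e^{s} - 1)}}{e^{s(1+t)np}} & (1 + x \le e^{x}) \\ &= \left(\frac{e^{e^{s}-1}}{e^{s(1+t)}}\right)^{np}. \end{align*} Now choose $s = \ln(1+t)$, which minimizes the bound: then $e^{s} - 1 = t$ and $e^{s(1+t)} = (1+t)^{1+t}$, so $$\Pr(X > (1+t)np) \le \left(\frac{e^{t}}{(1+t)^{1+t}}\right)^{np},$$ which is the claimed bound.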
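As a complement, here is a small Monte Carlo sanity check of the bound. The specific values of $n$, $p$, and $t$, and the helper names `chernoff_bound` and `empirical_tail`, are my own illustrative choices, not part of the referenced notes:

```python
# Sanity check: compare the empirical tail Pr(X > (1+t)np) for
# X ~ Binomial(n, p) against the Chernoff bound (e^t / (1+t)^(1+t))^(np).
import math
import random

def chernoff_bound(n, p, t):
    """Right-hand side of the inequality: (e^t / (1+t)^(1+t))^(np)."""
    return (math.exp(t) / (1 + t) ** (1 + t)) ** (n * p)

def empirical_tail(n, p, t, trials=20_000, seed=0):
    """Monte Carlo estimate of Pr(X > (1+t) n p) for X = sum of n iid Bernoulli(p)."""
    rng = random.Random(seed)
    threshold = (1 + t) * n * p
    hits = 0
    for _ in range(trials):
        x = sum(rng.random() < p for _ in range(n))  # one Binomial(n, p) sample
        if x > threshold:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    n, p = 100, 0.3  # arbitrary illustrative parameters
    for t in (0.2, 0.5, 1.0):
        est = empirical_tail(n, p, t)
        bd = chernoff_bound(n, p, t)
        print(f"t={t}: empirical tail = {est:.5f}, Chernoff bound = {bd:.5f}")
        assert est <= bd  # the bound should dominate the empirical estimate
```

The bound is quite loose for small $t$ (it only decays once $t$ is bounded away from $0$ relative to $np$), which the printed comparison makes visible.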