Upper bound on sum of i.i.d. random variables


Here's a problem I've been struggling with:

Let $X_1, X_2, X_3, \ldots$ be an i.i.d. sequence of random variables with finite moment generating function $M(t)$. Define the sum $S_n = X_1 + \ldots + X_n$. For any $a < \mu = \mathbb{E}X_1$, prove that $$\displaystyle \mathbb{P}(S_n \geq an) \leq e^{-nI(a)},$$ where $I(a) = \sup_{t>0}\{at - \log M(t)\} > 0$.

What I've been able to hash out so far:

  • I should be able to simplify $\log M(t)$ to something like $\mu t$; this would let me rewrite the right-hand side in terms of some familiar distribution.
  • $\displaystyle \frac{S_n}{n}$ converges to $\mu = \mathbb{E}X_1$ by the law of large numbers, so the left-hand side should reduce to something simple, like $\mathbb{P}(\mu n - an \geq 0)$ (not exactly that, but something along those lines).

I'm having trouble finding a way to rigorously express these intuitions about the problem. In any case, thanks for any and all help.

Best answer

Hint: For any $t > 0$, Markov's inequality (in its exponential form) and the independence of the random variables give

$$\mathbb{P}(S_n \geq an) = \mathbb{P}(e^{t S_n} \geq e^{t a n}) \leq e^{-ant} \, \mathbb{E}e^{t S_n} = e^{-ant} \prod_{j=1}^n \mathbb{E}(e^{t X_j}).$$

Now use that the random variables are identically distributed to obtain the estimate you are aiming for.
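(Spelling out the finish the hint points to, as a sketch: since the $X_j$ are identically distributed, each factor $\mathbb{E}(e^{t X_j})$ equals $M(t)$, so for every $t > 0$

$$\mathbb{P}(S_n \geq an) \leq e^{-ant} \, M(t)^n = e^{-n(at - \log M(t))},$$

and since the left-hand side does not depend on $t$, we may take the infimum over $t > 0$ on the right:

$$\mathbb{P}(S_n \geq an) \leq \inf_{t>0} e^{-n(at - \log M(t))} = e^{-n \sup_{t>0}\{at - \log M(t)\}} = e^{-nI(a)}.)$$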

(As far as I can see, there is a typo in your question: Instead of $a<\mu$ it should read $a>\mu$; otherwise we cannot expect $I(a)>0$.)

By the way, this estimate is known as the Chernoff bound. It leads into the theory of large deviations.
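As a quick numerical sanity check (a sketch, not part of the proof; the Gaussian choice and the helper names are my own), take $X_i \sim N(0,1)$, so $\mu = 0$, $M(t) = e^{t^2/2}$, and for $a > 0$ the rate is $I(a) = \sup_{t>0}(at - t^2/2) = a^2/2$, attained at $t = a$. The exact Gaussian tail then sits below $e^{-nI(a)}$:

```python
import math

def chernoff_rate_gaussian(a):
    # For X ~ N(0,1): M(t) = exp(t^2/2), so
    # I(a) = sup_{t>0} (a t - t^2/2) = a^2/2 for a > 0 (optimum at t = a).
    return a * a / 2.0

def exact_tail_gaussian(a, n):
    # S_n ~ N(0, n), hence P(S_n >= a n) = P(Z >= a sqrt(n)) for Z ~ N(0,1),
    # computed via the complementary error function.
    return 0.5 * math.erfc(a * math.sqrt(n) / math.sqrt(2.0))

a, n = 1.0, 20
bound = math.exp(-n * chernoff_rate_gaussian(a))
exact = exact_tail_gaussian(a, n)
print(exact, bound)  # exact tail is below the Chernoff bound
```

With $a = 1$ and $n = 20$ the bound is $e^{-10} \approx 4.5 \times 10^{-5}$, while the exact tail $\mathbb{P}(Z \geq \sqrt{20})$ is smaller still, as the bound guarantees.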