$EX = 0$ and $\textrm{MGF}(t)= \infty$ for all $t>0$


This is Durrett exercise 2.7.5:

Suppose that $X_1,X_2,\ldots$ are iid with $EX_1 = 0$ and $E[e^{tX_1}] = \infty$ for all $t>0$. Let $S_n = X_1+\cdots+X_n$. Prove that for all $a>0$: $$\frac{1}{n} \log P(S_n \geq na) \xrightarrow{n\to\infty} 0.$$

I have some intuition for this exercise: $P(X\geq a)$ must decay slowly enough that the moment generating function is infinite. But I don't know how to start. Can someone please help?
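To make the claim concrete before proving it, here is a Monte Carlo sketch (my own illustration, not part of the exercise). The distribution, the helper names `sample_x` and `rate_estimate`, and the parameter choices are all assumptions made for this example: I take $|X| = (\ln(1/U))^2$ with a random sign, so that $P(|X|>x) = e^{-\sqrt{x}}$, which gives $EX=0$, $E|X| < \infty$, and $E[e^{tX}] = \infty$ for every $t>0$.

```python
import math
import random

random.seed(0)

# Illustrative heavy-tailed example (my choice, not from the exercise):
# |X| = (ln(1/U))^2 with a random sign, so P(|X| > x) = exp(-sqrt(x)).
# EX = 0 by symmetry, E|X| = 2 < infinity, and E[e^{tX}] = infinity for
# every t > 0 because t*x - sqrt(x) -> infinity.
def sample_x():
    u = random.random()
    magnitude = math.log(1.0 / u) ** 2
    return magnitude if random.random() < 0.5 else -magnitude

def rate_estimate(n, a=1.0, trials=20_000):
    """Monte Carlo estimate of (1/n) log P(S_n >= n*a)."""
    hits = 0
    for _ in range(trials):
        s = sum(sample_x() for _ in range(n))
        if s >= n * a:
            hits += 1
    # Guard against zero hits so the logarithm is defined.
    p_hat = max(hits, 1) / trials
    return math.log(p_hat) / n

# The exercise asserts these rates tend to 0; empirically they are
# negative but creep toward 0 as n grows.
rates = [rate_estimate(n) for n in (10, 40, 160)]
print(rates)
```

The point of the sketch: even though $P(S_n \geq na)$ shrinks, it shrinks subexponentially, so $\frac1n \log P(S_n \geq na)$ drifts toward $0$ rather than toward a negative constant.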


Write $X = X_1$. For $\theta > 0$ we have $e^{\theta X} \leq e^{\theta X^+}$, so $E[e^{\theta X^+}] = \infty$ as well. For a nonnegative random variable $Y$ and a nondecreasing differentiable $g \geq 0$, $$E[g(Y)] = g(0) + \int_0^\infty g'(t)\, P(Y > t)\,dt.$$ Applying this with $Y = X^+$ and $g(t) = e^{\theta t}$, and noting $P(X^+ > t) = P(X > t)$ for $t \geq 0$, we get \begin{align*} E[e^{\theta X^+}] &= 1 + \theta \int_{0}^{\infty} e^{\theta t}\, P(X > t)\,dt \\ &= 1 + \theta \sum_{n\geq 0} \int_{n}^{n+1} e^{\theta t}\, P(X > t)\,dt \\ &\leq 1 + \theta \sum_{n\geq 0} e^{\theta(n+1)}\, P(X > n), \end{align*} using that $e^{\theta t} \leq e^{\theta(n+1)}$ and $P(X > t) \leq P(X > n)$ on $[n, n+1]$. Since the left-hand side is infinite, $$\sum_{n\geq 0} e^{\theta n}\, P(X > n) = \infty \quad \text{for every } \theta > 0.$$

Now fix $\theta, \epsilon > 0$. We must have $P(X > n) > e^{-(\theta + \epsilon)n}$ infinitely often: otherwise the series above would be dominated by a multiple of $$\sum_{n\geq 1} e^{-\epsilon n} < \infty.$$ Therefore $$\limsup_{n\to\infty} \frac{1}{n} \log P(X > n) \geq -(\theta + \epsilon).$$ Taking $\epsilon \to 0$ and then $\theta \to 0$, and noting the $\limsup$ is trivially $\leq 0$, gives $\limsup\limits_{n\to\infty}\frac{1}{n} \log P(X > n) = 0$.

Finally, to pass to $S_n$: since $\{X_1 \geq n(a+1)\} \cap \{S_n - X_1 \geq -n\} \subseteq \{S_n \geq na\}$, independence gives $$P(S_n \geq na) \geq P(X_1 \geq n(a+1))\, P(S_{n-1} \geq -n),$$ and $P(S_{n-1} \geq -n) \to 1$ by the weak law of large numbers. Combining this with the tail estimate above along a suitable subsequence, and with the lemma proved in class that $\frac{1}{n} \log P(S_n \geq na)$ converges (by superadditivity), we conclude $\frac{1}{n} \log P(S_n \geq na) \to 0$ for all $a > 0$.
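As a quick sanity check on the key series fact (the toy tail here is my own illustration, not part of the exercise): with $P(X > n) = e^{-\sqrt{n}}$, the terms $e^{\theta n} P(X > n) = e^{\theta n - \sqrt{n}}$ blow up for every $\theta > 0$, so the series diverges, and yet $\frac{1}{n} \log P(X > n) = -n^{-1/2} \to 0$, which is exactly the subexponential regime the argument exploits.

```python
import math

# Hypothetical tail chosen purely for illustration: P(X > n) = exp(-sqrt(n)).
def tail(n):
    return math.exp(-math.sqrt(n))

theta = 0.1
ns = [1, 100, 400, 900]

# Terms e^{theta n} P(X > n) = exp(theta*n - sqrt(n)) grow without bound,
# so the series sum_n e^{theta n} P(X > n) diverges for every theta > 0.
terms = [math.exp(theta * n) * tail(n) for n in ns]

# Meanwhile (1/n) log P(X > n) = -1/sqrt(n) -> 0, so the limsup of the
# tail exponent is 0, matching the conclusion of the answer.
tail_rates = [math.log(tail(n)) / n for n in ns]

print(terms)
print(tail_rates)
```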