I am working on the following probability problem from Durrett:
Suppose $(X_{i})_{i\geq 1}$ are i.i.d., $EX_{i}=0$, and $E\left(e^{\theta X_{i}}\right)=\infty$ for all $\theta>0$. Show that $\frac{1}{n}\log P\left(S_{n}\geq na\right)\to 0$ for all $a>0$.
I tried to employ a strategy similar to one Durrett uses in one of his proofs, writing
$$ P\left(S_{n}\geq na\right)\geq P\left(S_{n-1}\geq -n\epsilon\right)\cdot P(X_{n}\geq n(a+\epsilon)), $$ since on the event $\{S_{n-1}\geq -n\epsilon\}\cap\{X_{n}\geq n(a+\epsilon)\}$ we have $S_{n}\geq na$. If I could somehow conclude that the right-hand side of the inequality above is $$ \geq C\, P(X_{n}\geq n(a+2\epsilon))\qquad(*) $$ for some constant $0<C\leq 1$, or some similar $\epsilon$ modification, I'd be able to finish the proof, since $$ \limsup_{n\to\infty} P(X_{1}\geq n(a+2\epsilon))^{1/n}=1 $$ by a Borel–Cantelli argument and the fact that $E(e^{\theta X_{i}})=\infty$ for all $\theta>0$. My problem is that I don't know how to obtain a bound similar to $(*)$. In Durrett's proof, he states that it follows from the Weak Law of Large Numbers, but it is not clear to me how.
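(For context, here is a sketch of the limsup claim I am using; this is my own reconstruction, not a quote from Durrett. Fix $\lambda>0$ and suppose, for contradiction, that $P(X_{1}\geq n\lambda)\leq e^{-n\delta}$ for some $\delta>0$ and all $n$ large enough. Then for any $0<\theta<\delta/\lambda$, splitting $E(e^{\theta X_{1}})$ over $\{X_{1}<0\}$ and the slabs $\{n\lambda\leq X_{1}<(n+1)\lambda\}$ gives $$ E\left(e^{\theta X_{1}}\right)\leq 1+\sum_{n\geq 0}e^{\theta(n+1)\lambda}\,P\left(X_{1}\geq n\lambda\right)<\infty, $$ since, up to finitely many terms, the summands are dominated by $e^{\theta\lambda}e^{-n(\delta-\theta\lambda)}$. This contradicts $E(e^{\theta X_{1}})=\infty$ for all $\theta>0$, so $\limsup_{n}\frac{1}{n}\log P(X_{1}\geq n\lambda)=0$, i.e. $\limsup_{n} P(X_{1}\geq n\lambda)^{1/n}=1$.)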
This indeed follows from the weak law of large numbers: note that $$\Pr\left\{S_{n-1}\geqslant -n\varepsilon\right\} \geqslant\Pr\left\{\left|S_{n-1}\right|/n\leqslant \varepsilon \right\}=1-\Pr\left\{\left|S_{n-1}\right|/n \gt \varepsilon\right\}.$$ Since $\mathbb E\left[X_i\right]=0$ and $\left(X_i\right)_{i\geqslant 1}$ is an i.i.d. sequence, the sequence $\left(\left|S_{n}\right|/n\right)_{n\geqslant 1}$ goes to $0$ in probability, and so does $\left(\left|S_{n-1}\right|/n\right)_{n\geqslant 1}$, because $\left|S_{n-1}\right|/n\leqslant \left|S_{n-1}\right|/(n-1)$. Therefore, the quantity $\Pr\left\{S_{n-1}\geqslant -n\varepsilon\right\}$ is bigger than $1/2$ for $n$ large enough.
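Not needed for the proof, but a quick simulation illustrates the bound. As a concrete example of a distribution satisfying the hypotheses (my choice, not from the problem), take $X_i = e^{Z_i} - E[e^{Z}]$ with $Z_i\sim N(0,1)$: then $E[X_i]=0$ and $E[e^{\theta X_i}]=\infty$ for every $\theta>0$ because lognormal tails kill every exponential moment. The estimated value of $\Pr\{S_{n-1}\geqslant -n\varepsilon\}$ should climb toward $1$ as $n$ grows.

```python
import math
import random

# Monte Carlo sketch: X_i = exp(Z_i) - E[exp(Z)] with Z ~ N(0,1) is mean-zero
# and has E[e^{theta X}] = infinity for every theta > 0 (lognormal right tail).
# The WLLN predicts P(S_{n-1} >= -n*eps) -> 1 as n -> infinity.

random.seed(0)
MEAN = math.exp(0.5)  # E[exp(Z)] for Z ~ N(0,1)

def estimate(n, eps, trials=1000):
    """Estimate P(S_{n-1} >= -n*eps) by simulating `trials` sample paths."""
    hits = 0
    for _ in range(trials):
        s = sum(math.exp(random.gauss(0.0, 1.0)) - MEAN for _ in range(n - 1))
        if s >= -n * eps:
            hits += 1
    return hits / trials

for n in (10, 100, 500):
    print(n, estimate(n, eps=0.5))
```

The printed estimates are only meant to show the qualitative trend; any mean-zero i.i.d. sequence would exhibit the same behavior, the lognormal choice just keeps the example inside the problem's hypotheses.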