IID random variables $(X_n)$ have $\sum e^{X_n} c^n < \infty$ a.s.


I'm working on the following exercise:

Let $X_1, X_2, \ldots$ be i.i.d. nonnegative random variables. By virtue of the Borel-Cantelli lemma, show that for every $c \in (0,1)$, $$ \sum_{n=1}^\infty e^{X_n} c^n \begin{cases} < \infty \textrm{ a.s.} & \textrm{if } \mathbb E[X_1] < \infty; \\ = \infty \textrm{ a.s.} & \textrm{if } \mathbb E[X_1] = \infty \end{cases} $$

I'm trying to show $\sum_{n=1}^\infty \mathbb P\left[\sum_{k=1}^n e^{X_k} c^k \geq M\right] < \infty$ for some large $M > 0$. For then, Borel-Cantelli gives us that $$ \mathbb P\left[\limsup \left\{ \sum_{k=1}^n e^{X_k} c^k \geq M\right\}\right] = \mathbb P\left[\sum_{k=1}^\infty e^{X_k} c^k \geq M\right] = 0$$ and we're done. But I don't know how to show $\sum_{n=1}^\infty \mathbb P\left[\sum_{k=1}^n e^{X_k} c^k \geq M\right] < \infty$. Any suggestions?

There are 2 best solutions below

If $\mathbb{E}(X_1)< \infty$ then it follows from the strong law of large numbers that $S_n := \sum_{j=1}^n X_j$ satisfies

$$\lim_{n \to \infty} \frac{S_n}{n} = \mathbb{E}(X_1) \quad \text{a.s.};$$

hence

$$\lim_{n \to \infty} \frac{X_n}{n} = \lim_{n \to \infty} \left( \frac{S_n}{n} - \frac{S_{n-1}}{n} \right)=0 \quad \text{a.s.}$$

Consequently, for fixed $c\in (0,1)$ and almost all $\omega \in \Omega$ there exists some $N \in \mathbb{N}$ such that $$\left| \frac{X_n(\omega)}{n} \right| \leq -\log(\sqrt{c}) \quad \text{for all $n \geq N$},$$ and so $$\sum_{n \geq N} e^{X_n(\omega)} c^n \leq \sum_{n \geq N} \sqrt{c}^n < \infty.$$
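A quick numerical sanity check of this convergence (not part of the proof): take, say, $X_n \sim \mathrm{Exp}(1)$, so $\mathbb{E}(X_1) = 1 < \infty$, and $c = 1/2$. The distribution, $c$, and the sample size below are my own illustrative choices. A minimal sketch using only the Python standard library:

```python
import math
import random

random.seed(0)  # fixed seed for reproducibility

c = 0.5   # any c in (0,1) works; 0.5 is an arbitrary choice
N = 400   # number of terms to simulate

# i.i.d. nonnegative samples with finite mean: X_n ~ Exp(1)
X = [random.expovariate(1.0) for _ in range(N)]

# terms e^{X_n} c^n and their partial sums
terms = [math.exp(X[n]) * c ** (n + 1) for n in range(N)]
partial = []
s = 0.0
for t in terms:
    s += t
    partial.append(s)

# Since X_n / n -> 0 a.s., eventually e^{X_n} c^n <= (sqrt(c))^n,
# so the partial sums should settle very quickly:
print(partial[199], partial[399])
```

The partial sums at $n = 200$ and $n = 400$ agree to many decimal places, which is what the geometric domination by $\sqrt{c}^{\,n}$ predicts.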


If $\mathbb{E}(X_1)=\infty$ then

$$\sum_{n \geq 1} \mathbb{P}(X_n \geq n)=\sum_{n \geq 1} \mathbb{P}(X_1 \geq n) =\infty,$$

and therefore it follows from the second Borel–Cantelli lemma (the $X_n$ are independent) that $$\mathbb{P}(X_n \geq n \, \, \text{infinitely often})=1,$$ i.e. $$e^{X_n} \geq e^n \quad \text{for infinitely many $n$ with probability 1.}$$ This implies $\sum_{n \geq 1} e^{X_n} c^n = \infty$ almost surely for $c:= 1/e$. For general $c \in (0,1)$, run the same argument with the events $\{X_n \geq n \log(1/c)\}$: since $\mathbb{E}(X_1/\log(1/c)) = \infty$, the series $\sum_{n} \mathbb{P}(X_1 \geq n \log(1/c))$ diverges as well.
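The divergent tail-sum condition $\sum_n \mathbb{P}(X_1 \geq n) = \infty$ can be made concrete with a deterministic computation. Assume, for illustration, a standard Pareto-type tail $\mathbb{P}(X_1 \geq t) = \min(1, 1/t)$, which has $\mathbb{E}(X_1) = \infty$; then the tail sum is essentially the harmonic series and grows like $\log N$:

```python
import math

# Heavy-tailed example: P(X_1 >= t) = min(1, 1/t), so E[X_1] = integral of
# the tail over [0, infinity) = infinity.
def tail(t: float) -> float:
    return min(1.0, 1.0 / t)

# Partial sums of sum_{n>=1} P(X_1 >= n): essentially the harmonic series.
def tail_sum(N: int) -> float:
    return sum(tail(n) for n in range(1, N + 1))

# Grows like log N, hence diverges; going from N = 100 to N = 10^4
# adds roughly log(100) ~ 4.6:
print(tail_sum(100), tail_sum(10_000))
```

Since $H_N \approx \log N + \gamma$, the increment from $N = 100$ to $N = 10^4$ is close to $\log 100$, confirming the divergence numerically.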

1
On

This is an exercise from Achim Klenke's probability book, and it appears before the chapter on the laws of large numbers, so we need to handle it without them.

Since $X_1$ is nonnegative, from the previous chapter of the book we know that $$ \mathrm{E}X_1= \int_{[0,\infty )}\Pr [X_1\geqslant t] \mathop{}\!d t\tag{*} $$ Then for any chosen $\epsilon >0$ we have that $$ \begin{align*} \epsilon \Pr [X_1\geqslant (n+1) \epsilon ]\leqslant \int_{n \epsilon }^{(n+1) \epsilon }\Pr [X_1\geqslant t] \mathop{}\!dt \leqslant \epsilon \Pr [X_1\geqslant n \epsilon ]\\ \therefore\quad \epsilon\sum_{n\geqslant 0}\Pr [X_1\geqslant (n+1)\epsilon ]\leqslant \mathrm{E}X_1\leqslant \epsilon \sum_{n\geqslant 0}\Pr [X_1\geqslant n \epsilon ] \end{align*}\tag1 $$
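The sandwich $(1)$ can be verified exactly for a distribution where both sides have closed forms. For $X_1 \sim \mathrm{Exp}(\lambda)$ (my choice of example) the tail is $\Pr[X_1 \geqslant t] = e^{-\lambda t}$, the mean is $1/\lambda$, and the two Riemann-type sums are geometric series:

```python
import math

# Check the sandwich (1) for X_1 ~ Exp(lam):
#   P[X_1 >= t] = e^{-lam t},   E[X_1] = 1/lam.
lam, eps = 2.0, 0.01  # illustrative parameter choices

# With q = e^{-lam*eps}, the sums in (1) are geometric:
#   eps * sum_{n>=0} P[X_1 >= (n+1)eps] = eps * q / (1 - q)   (lower bound)
#   eps * sum_{n>=0} P[X_1 >= n*eps]    = eps / (1 - q)       (upper bound)
q = math.exp(-lam * eps)
lower = eps * q / (1.0 - q)
upper = eps / (1.0 - q)
mean = 1.0 / lam

print(lower, mean, upper)
```

Note that the two bounds differ by exactly $\epsilon$ times the first tail probability's gap, i.e. $\text{upper} - \text{lower} = \epsilon$, so the sandwich becomes tight as $\epsilon \to 0$.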

Now compare $e^{X_k}$ with $c^{-k}$: if $$ \Pr [e^{X_k}\geqslant c^{-k} \text{ i.o. }]=\Pr [X_k\geqslant k\log(c^{-1})\text{ i.o. }]=1\tag2 $$ for a given $c\in(0,1)$, then $e^{X_k}c^k\geqslant 1$ for infinitely many $k$, so $\sum_{k\geqslant 1}e^{X_k}c^k=\infty$ almost surely. If $\mathrm{E}X_1=\infty $, then taking $\epsilon =\log(c^{-1})$ in $(1)$ shows that $\sum_{k}\Pr [X_1\geqslant k\log(c^{-1})]=\infty$, so $(2)$ follows from the second Borel–Cantelli lemma (using independence) and the conclusion follows.

Now, to prove the other assertion it is enough to show that $$ \mathrm{E}X_1<\infty \implies \Pr [e^{X_k}< c^{-k/2}\text{ eventually }]=1\tag3 $$ since then $e^{X_k}c^k< c^{k/2}$ for all large $k$, and $\sum_{k}c^{k/2}<\infty$. The condition in $(3)$ is equivalent to $\Pr [X_k\geqslant k\log(c^{-1/2})\text{ i.o. }]=0$, and this follows from $(1)$ with $\epsilon =\log(c^{-1/2})$ and the first Borel–Cantelli lemma. $\Box$
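To see numerically why the eventual bound in $(3)$ suffices: once $e^{X_k} c^k < c^{k/2}$ holds from some index $N$ on, the remaining tail is dominated by a geometric series with closed form $\sum_{k \geq N} c^{k/2} = c^{N/2}/(1 - \sqrt{c})$. A quick check of that closed form against direct summation (the values of $c$ and $N$ are arbitrary illustrative choices):

```python
import math

c, N = 0.5, 10       # illustrative choices
r = math.sqrt(c)     # ratio of the dominating geometric series

# Closed form for the dominating tail sum_{k >= N} c^{k/2}
closed = r ** N / (1.0 - r)

# Direct truncated summation (terms beyond k = 2000 are negligible here)
numeric = sum(r ** k for k in range(N, 2000))

print(closed, numeric)
```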