Necessary and sufficient condition for convergence of series


I'm solving this exercise in Klenke's book:

Let $X_1,X_2, \dots $ be i.i.d. nonnegative random variables. By virtue of the Borel-Cantelli lemma, show that for every $c \in(0,1)$, $$\sum_{n=1}^\infty e^{X_n} c^n \begin{cases} < \infty \textrm{ a.s.} & \textrm{if } \mathbb E[X_1] < \infty; \\ = \infty \textrm{ a.s.} & \textrm{if } \mathbb E[X_1] = \infty \end{cases}$$

There are different ways to prove the statement using the Borel-Cantelli lemma (here's a thread with different answers: link). However, I wanted to try a different approach. I defined $S_k := \sum_{n=1}^k e^{X_n} c^n$, which, given the nonnegativity of its terms, converges from below to $S := \sum_{n=1}^\infty e^{X_n} c^n$. We can prove using the 0-1 law that $S = a$ almost surely, where $a \in [-\infty, \infty]$ is a constant. Applying the monotone convergence theorem and taking expectations then gives: $$ a=\mathbb{E}[S]=\sum_{n=1}^\infty \mathbb{E}[e^{X_n}] c^n =\mathbb{E}[e^{X_1}] \sum_{n=1}^\infty c^n, $$ which means that $a$ is finite iff $\mathbb{E}[e^{X_1}] < \infty$. This, however, is not equivalent to the statement in the exercise. Does anyone see where the argument fails?
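For a concrete illustration of why the two conditions differ (my own example, not from the exercise): take a discrete distribution with

```latex
\mathbb{P}(X_1 = n) = \frac{1}{\zeta(3)\, n^{3}}, \qquad n \in \mathbb{N}.
% Then the mean is finite:
%   E[X_1] = (1/zeta(3)) * sum 1/n^2 = zeta(2)/zeta(3) < infinity,
% but the exponential moment diverges:
%   E[e^{X_1}] = (1/zeta(3)) * sum e^n / n^3 = infinity.
```

So $\mathbb{E}[X_1] < \infty$ is strictly weaker than $\mathbb{E}[e^{X_1}] < \infty$, which is why the computed criterion cannot match the one in the exercise.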


Best answer:

We have to distinguish the following two lemmata, both of which are a consequence of Kolmogorov's 0-1-law:

Lemma 1
Let $(X_{k})_{k \in \mathbb{N}}$ be a sequence of independent random variables and let $S_{n}:=\sum_{k=1}^{n}X_{k}$.
Then $\mathbb{P}(S_{n} \text{ converges}) \in \{0,1\}$.

Lemma 2
Any random variable $Y$ that is measurable with respect to the tail sigma-field of such a sequence of independent random variables is a.s. constant.

To prove almost sure convergence, we could apply Kolmogorov's three-series theorem, but that is itself a consequence of Borel-Cantelli, so there is no shortcut here.
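For reference, the three-series theorem states that for independent $(X_k)$, the series $\sum_k X_k$ converges a.s. if and only if, for some (equivalently, every) $A > 0$, all three of the following hold:

```latex
\sum_{k=1}^{\infty} \mathbb{P}\bigl(|X_k| > A\bigr) < \infty, \qquad
\sum_{k=1}^{\infty} \mathbb{E}\bigl[X_k \mathbf{1}_{\{|X_k| \le A\}}\bigr] \text{ converges}, \qquad
\sum_{k=1}^{\infty} \operatorname{Var}\bigl(X_k \mathbf{1}_{\{|X_k| \le A\}}\bigr) < \infty.
```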

Finally, Kolmogorov's 0-1 law does not allow us to conclude that the limit $S=\lim S_{n}$ is constant, even when it exists, since $S$ is not measurable with respect to the tail sigma-field.
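A quick simulation sketch (my own, not part of the answer) illustrates both points at once. Assume $X_n \sim \mathrm{Exp}(1)$, so $\mathbb{E}[X_1] = 1 < \infty$ while $\mathbb{E}[e^{X_1}] = \infty$: the truncated sums stay finite across runs, yet the limit is visibly random rather than a.s. constant.

```python
import math
import random
import statistics

# Sketch: X_n ~ Exp(1), so E[X_1] = 1 < infinity but E[e^{X_1}] = infinity.
# By the exercise, S = sum_{n>=1} e^{X_n} c^n is finite a.s. for c in (0,1),
# yet the realizations below show that S is NOT an a.s. constant.

def sample_S(c=0.5, n_terms=200, rng=random):
    """One realization of the truncated series sum_{n=1}^{n_terms} e^{X_n} c^n."""
    return sum((c ** n) * math.exp(rng.expovariate(1.0))
               for n in range(1, n_terms + 1))

rng = random.Random(42)
samples = [sample_S(rng=rng) for _ in range(2000)]

print(min(samples), max(samples))      # all finite, but spread out
print(statistics.pstdev(samples))      # strictly positive: the limit is random
```

The positive standard deviation is the point: the value of the limit depends on the early terms (in particular $X_1$), even though the event "the series converges" is a tail event.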

Another answer:

I would argue that your approach shows precisely why the 0-1 law cannot be used to prove convergence to a constant in your case. I assume you used Kolmogorov's 0-1 law; it does not show convergence to a constant here, only that $S_k$ converges almost surely, because $\{S_k \rightarrow a\}$ for a fixed $a$ is not a tail event: the value of the limit very much depends on the value of $X_1$. (It might be clearer to write the series as $c e^{X_1} + c^2 e^{X_2} + \cdots$; the value of $X_1$ clearly influences the value of the limit.)

Kolmogorov's 0-1 law can only show that the sequence converges almost surely, since whether it converges indeed depends only on the tail behavior.
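One way to make this precise (a sketch in the notation above): split off the first term,

```latex
S = c\, e^{X_1} + R, \qquad R := \sum_{n=2}^{\infty} e^{X_n} c^{\,n},
% where R is measurable with respect to sigma(X_2, X_3, ...) and hence
% independent of X_1.
```

If $S = a$ a.s. for a constant $a$, then $c\,e^{X_1} = a - R$ a.s.; the left side is $\sigma(X_1)$-measurable while the right side is independent of $X_1$, so $c\,e^{X_1}$ is independent of itself and therefore a.s. constant, forcing $X_1$ to be degenerate. Hence for any non-degenerate $X_1$, the limit $S$ cannot be an a.s. constant.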