I'm solving this exercise in Klenke's book:
Let $X_1,X_2, \dots $ be i.i.d. nonnegative random variables. By virtue of the Borel-Cantelli lemma, show that for every $c \in(0,1)$, $$\sum_{n=1}^\infty e^{X_n} c^n \begin{cases} < \infty \textrm{ a.s.} & \textrm{if } \mathbb E[X_1] < \infty; \\ = \infty \textrm{ a.s.} & \textrm{if } \mathbb E[X_1] = \infty \end{cases}$$
There are different ways to prove the statement using the Borel-Cantelli lemma (here's a thread with different answers: link). However, I wanted to try a different approach. I defined $S_k := \sum_{n=1}^k e^{X_n} c^n$, which, given the nonnegativity of its terms, increases from below to $S:= \sum_{n=1}^\infty e^{X_n} c^n$. We can prove using the 0-1 law that $S=a$ almost surely, where $a \in [-\infty, \infty]$ is a constant. Applying the monotone convergence theorem and taking expectations then delivers: $$ a=\mathbb{E}[S]=\sum_{n=1}^\infty \mathbb{E}[e^{X_n}] c^n =\mathbb{E}[e^{X_1}] \sum_{n=1}^\infty c^n $$ This means that $a$ is finite iff $\mathbb{E}[e^{X_1}] < \infty$, which is not equivalent to the statement in the exercise. Does anyone see where the argument fails?
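For what it's worth, the two integrability conditions really do come apart. Here is a minimal numerical sketch (the choice $X \sim \mathrm{Exp}(1)$ is my own, not from the exercise): in that case $\mathbb{E}[X] = 1 < \infty$, yet $\mathbb{E}[e^X] = \int_0^\infty e^x e^{-x}\,dx = \infty$, and the sample mean of $e^X$ drifts upward instead of stabilising:

```python
import math
import random

random.seed(1)

# X ~ Exp(1): E[X] = 1 is finite, but E[e^X] = integral of e^x * e^{-x} dx = infinity.
results = {}
for n in (10**3, 10**5, 10**6):
    xs = [random.expovariate(1.0) for _ in range(n)]
    mean_x = sum(xs) / n                   # stabilises near E[X] = 1
    mean_ex = sum(map(math.exp, xs)) / n   # grows roughly like log n: no finite limit
    results[n] = (mean_x, mean_ex)
    print(n, round(mean_x, 3), round(mean_ex, 1))
```

The sample mean of $X$ settles near $1$ while the sample mean of $e^X$ keeps climbing with the sample size, so finiteness of $\mathbb{E}[X_1]$ and finiteness of $\mathbb{E}[e^{X_1}]$ are genuinely different conditions.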
We have to distinguish the following two lemmata, both of which are consequences of Kolmogorov's 0-1 law:
Lemma 1
Let $(X_{k})_{k \in \mathbb{N}}$ be a sequence of independent random variables and let $S_{n}:=\sum_{k=1}^{n}X_{k}$.
Then $\mathbb{P}(S_{n} \text{ converges}) \in \{0,1\}$.
Lemma 2
Any random variable $Y$ that is measurable with respect to the tail $\sigma$-field of such a sequence of independent random variables is a.s. constant.
To prove almost sure convergence of $S_{n}$, we could apply Kolmogorov's Three-Series Theorem, but that is itself a consequence of Borel-Cantelli - so no shortcut here.
Finally, Kolmogorov's 0-1 law does not allow us to conclude that the limit $S=\lim S_{n}$ is constant, even if it exists: $S$ is not measurable with respect to the tail $\sigma$-field, since changing the value of $X_{1}$ changes the value of $S$. Only the *event* that $S_{n}$ converges is a tail event, not the value of the limit itself.
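To see concretely that $S$ need not be constant, here is a short simulation (the choices $X_n \sim \mathrm{Exp}(1)$, $c = 1/2$, and the truncation level are mine, not from the original). In this case $\mathbb{E}[X_1] = 1 < \infty$, so the exercise gives $S < \infty$ a.s., yet $\mathbb{E}[e^{X_1}] = \infty$, so monotone convergence gives $\mathbb{E}[S] = \infty$: every realisation of $S$ is finite, but the values are spread out, i.e. $S$ is a non-degenerate random variable.

```python
import math
import random

random.seed(0)

C = 0.5
N = 200  # truncation level; the neglected tail is tiny since e^{X_n} * C**n -> 0 a.s.

def sample_S() -> float:
    """One realisation of S = sum_{n>=1} e^{X_n} * c**n with X_n ~ Exp(1), truncated at N."""
    return sum(math.exp(random.expovariate(1.0)) * C**n for n in range(1, N + 1))

samples = [sample_S() for _ in range(10_000)]
print(min(samples), max(samples))  # every realisation is finite, but they spread widely
```

If $S$ were a.s. constant, the minimum and maximum would essentially agree; instead the empirical distribution is heavy-tailed (its mean is infinite), which is exactly the gap in the 0-1-law argument above.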