Convergence of a series of not necessarily independent random variables


I have been trying to solve the following problem:

Let $X_1, \ldots $ be a sequence of random variables with $\sum_n \mathbb{E}|X_n| < \infty$. Show that $X:= \sum_n X_n$ exists almost surely and that $E(X) = \sum_n \mathbb{E}X_n$.

I am aware that a powerful tool in such cases is Kolmogorov's three-series theorem, and its conditions are easy to check for this problem, but the theorem does not seem applicable here since independence is missing. If there is a direct argument that handles this problem, could you please lend me your insights?

Thank you.

Best answer:

Recall that a random variable is a (measurable) function.

We are given

  • a probability space $(\Omega, \mathcal{A}, \mu)$ (I know, probabilists tend to use $P$ for the probability measure, I use $\mu$.)
  • a sequence $(X_n)$ of measurable functions on that probability space (probably real-valued, but it would work the same with complex-valued or $\mathbb{R}^d$-valued functions)
  • and $Y := \sum_n \lvert X_n\rvert$ is integrable; this is exactly what $\sum_n \mathbb{E}\lvert X_n\rvert < +\infty$ says, since by the monotone convergence theorem (Tonelli) $\mathbb{E}[Y] = \sum_n \mathbb{E}\lvert X_n\rvert$.

Since $Y$ is integrable, it is finite $\mu$-almost everywhere, so the set $$N := \biggl\{ \omega \in \Omega : \sum_{n = 1}^{\infty} \lvert X_n(\omega)\rvert = +\infty\biggr\}$$ is a $\mu$-null set.

On $\Omega \setminus N$ the series $\sum_{n = 1}^{\infty} X_n(\omega)$ converges absolutely. Defining $X := 0$ on $N$ (assuming $\mu$ is a complete measure, otherwise modify each $X_n$ to be $0$ on $N$ first) we get a measurable function on $\Omega$, and since $\lvert X(\omega)\rvert \leqslant Y(\omega)$ for all $\omega \in \Omega$ it follows that $X \in L^1(\mu)$. Moreover, the partial sums satisfy $\bigl\lvert \sum_{n = 1}^{N} X_n(\omega)\bigr\rvert \leqslant Y(\omega)$ for every $N$, so the dominated convergence theorem (with dominating function $Y$) gives $$\mathbb{E}[X] = \int_{\Omega} X(\omega)\,d\mu = \int_{\Omega} \sum_{n = 1}^{\infty} X_n(\omega)\,d\mu = \sum_{n = 1}^{\infty} \int_{\Omega} X_n(\omega)\,d\mu = \sum_{n = 1}^{\infty} \mathbb{E}[X_n].$$
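As a sanity check (not a proof), here is a small simulation sketch of the statement with a deliberately *dependent* sequence: take $X_n = Z/n^2$ for a single shared $Z \sim \mathrm{Uniform}(0,1)$, so the $X_n$ are far from independent, yet $\sum_n \mathbb{E}\lvert X_n\rvert = \sum_n \frac{1}{2n^2} < \infty$. The limit is $X = Z \cdot \pi^2/6$ with $\mathbb{E}[X] = \pi^2/12$. The variable names and truncation level below are illustrative choices, not part of the original argument.

```python
import numpy as np

# Dependent example: X_n = Z / n^2 with one shared Z ~ Uniform(0, 1).
# E|X_n| = 1/(2 n^2) is summable, so the theorem applies even though
# the X_n are completely dependent on each other.
rng = np.random.default_rng(0)
n_samples = 100_000   # Monte Carlo sample size (arbitrary choice)
n_terms = 1_000       # truncation level for the series (arbitrary choice)

z = rng.uniform(0.0, 1.0, size=n_samples)
coeffs = 1.0 / np.arange(1, n_terms + 1) ** 2  # 1/n^2 for n = 1..n_terms

# Truncated series S_N(omega) = sum_{n<=N} X_n(omega) = z * sum_{n<=N} 1/n^2;
# it converges absolutely for every omega, matching the a.s. convergence claim.
x = z * coeffs.sum()

empirical_mean = x.mean()
theoretical_mean = (np.pi ** 2 / 6) / 2  # E[Z] * pi^2/6 = pi^2/12

print(empirical_mean, theoretical_mean)
```

The empirical mean agrees with $\pi^2/12 \approx 0.8225$ up to Monte Carlo and truncation error, illustrating $\mathbb{E}[X] = \sum_n \mathbb{E}[X_n]$ without any independence assumption.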