Conditions for the formula for the expectation of a sum of infinitely many random variables


My book states that it's not always true that:

$$E\left(\sum_{i=1}^{\infty} X_i\right) = \sum_{i=1}^{\infty} E(X_i),$$ and that the step which can fail in general is this exchange:

$$E(\sum_{i=1}^{\infty} X_i) = E(\lim_{n\to\infty}(\sum_{i=1}^{n}X_i)) \stackrel{?}{=} \lim_{n\to\infty}E(\sum_{i=1}^{n}X_i)$$

Two special cases in which it is true are:

  1. $X_i$ are all nonnegative
  2. $\sum_{i=1}^{\infty}E(|X_i|)$ converges

I have no idea how to prove the equality under these two conditions. What am I missing here?
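To see why the equality can fail at all, a concrete counterexample (my own illustration, not from the book) may help: take $U\sim\mathrm{Unif}(0,1)$ and let $S_n = n\cdot 1\{U<1/n\}$, with $X_1=S_1$ and $X_n=S_n-S_{n-1}$ so that $S_n=\sum_{i=1}^{n}X_i$. Then $E(S_n)=n\cdot P(U<1/n)=1$ for every $n$, yet $S_n\to 0$ almost surely, so $E(\sum_{i=1}^{\infty}X_i)=0\neq 1=\lim_n E(S_n)$. A quick Monte Carlo sketch:

```python
import random

def partial_sum(u, n):
    """S_n = sum_{i=1}^n X_i, which telescopes to n * 1{u < 1/n}."""
    return float(n) if u < 1.0 / n else 0.0

random.seed(0)
samples = [random.random() for _ in range(200_000)]

# lim_{n->inf} E(S_n): E(S_n) = n * P(U < 1/n) = 1 for every n
for n in (10, 100, 1000):
    estimate = sum(partial_sum(u, n) for u in samples) / len(samples)
    print(f"E(S_{n}) ~ {estimate:.3f}")  # each estimate is close to 1

# E(lim_{n->inf} S_n): for any fixed u > 0 we have S_n(u) = 0 once n > 1/u,
# so the pointwise limit is 0 almost surely and its expectation is 0, not 1.
print("E(lim S_n) =", 0.0)
```

Note that the $X_n$ here are neither all non-negative nor absolutely summable in expectation ($E(|X_n|)=2/n$ for $n\ge 2$), so neither special case applies.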

1 Answer
This has to do with the ability to swap an integral with an infinite sum, which is valid under certain conditions. I think the main theorem you are missing is:

If $X_n:\Omega\to[0,\infty]$ are measurable functions for all $n\in \mathbb{N}$ and $X\left(\omega\right)=\sum_{n=1}^{\infty}X_{n}\left(\omega\right)$ for all $\omega\in \Omega$, then: $$\intop_{\Omega}X\,d\mu=\sum_{n=1}^{\infty}\intop_{\Omega}X_{n}\,d\mu$$

When you look at the definition of expectation, you get exactly such an integral: $E(X)=\intop_{\Omega}X\,dP$. As for the special cases you note:

  1. If the $X_i$ are all non-negative, the theorem applies directly with $\mu = P$.
  2. If $\sum_{i=1}^{\infty}E(|X_i|)$ converges, apply the theorem to the non-negative variables $|X_i|$ to get $E(\sum_{i=1}^{\infty}|X_i|)<\infty$. Then $\sum_{i=1}^{\infty}X_i$ converges absolutely almost surely, and the dominated convergence theorem (with dominating function $\sum_{i=1}^{\infty}|X_i|$) justifies the swap.
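A small numerical sketch of the non-negative case (a toy example of mine, not from Rudin): take $X_i = 1\{U < 2^{-i}\}$ with $U\sim\mathrm{Unif}(0,1)$. All $X_i\ge 0$, $\sum_i E(X_i) = \sum_i 2^{-i} = 1$, and $\sum_i X_i(u) = \lfloor\log_2(1/u)\rfloor$ almost surely, whose expectation is also 1:

```python
import math
import random

random.seed(1)
N = 200_000

# X_i = 1{U < 2^{-i}} are all non-negative, so the swap is always valid.
# Right-hand side: sum_i E(X_i) = sum_i 2^{-i} = 1 (60 terms reach machine precision).
series_of_expectations = sum(2.0 ** -i for i in range(1, 60))

# Left-hand side: sum_i X_i counts the i with U < 2^{-i}, i.e. floor(log2(1/U)) a.s.
estimate = sum(math.floor(math.log2(1.0 / random.random())) for _ in range(N)) / N

print(series_of_expectations)  # ~1.0
print(estimate)                # ~1.0
```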

EDIT

  1. For the theorem, I'm not sure it has a name, but it appears in Rudin's *Real and Complex Analysis* as Theorem 1.27.
  2. I can show the derivation from the theorem, writing the expectation as an integral over the sample space: $$ E\left(\sum_{n=1}^{\infty}X_n\right)=\intop_{\Omega}\left( \sum_{n=1}^{\infty} X_n\right) dP\overset{\text{Theorem}}{=}\sum_{n=1}^{\infty}\intop_{\Omega} X_n\, dP=\sum_{n=1}^{\infty}E(X_n) $$
  3. Strictly speaking, "$\infty=\infty$" has no rigorous meaning on its own, since we cannot compare infinities. The precise statement is that applying the theorem to the non-negative variables $|X_n|$ always gives $E(\sum_{n=1}^{\infty}|X_n|)=\sum_{n=1}^{\infty}E(|X_n|)$, with both sides possibly infinite; so if one side is unbounded, then so is the other, and vice versa.
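For the second special case, here is a sketch with signed variables (again a toy example of mine): take $X_i = (-1)^i\,1\{U<2^{-i}\}$, so $\sum_i E(|X_i|)=\sum_i 2^{-i}=1<\infty$, and both sides of the identity equal $\sum_i(-1)^i 2^{-i}=-1/3$:

```python
import random

random.seed(2)
N = 200_000

# Right-hand side: sum_i E(X_i) with E(X_i) = (-1)^i * 2^{-i}; the series sums to -1/3.
series_of_expectations = sum((-1) ** i * 2.0 ** -i for i in range(1, 60))

def total(u, terms=60):
    """sum_i X_i(u); the indicators vanish once 2^{-i} <= u, so 60 terms suffice."""
    return sum((-1) ** i for i in range(1, terms) if u < 2.0 ** -i)

# Left-hand side: E(sum_i X_i), estimated by Monte Carlo.
estimate = sum(total(random.random()) for _ in range(N)) / N

print(series_of_expectations)  # ~ -1/3
print(estimate)                # ~ -1/3
```

Because $\sum_i E(|X_i|)$ is finite, dominated convergence guarantees the two estimates agree in the limit.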