My book states that it's not always true that:
$$E\left(\sum_{i=1}^{\infty} X_i\right) = \sum_{i=1}^{\infty} E(X_i),$$ and that the step which can fail in general is the last equality in:
$$E(\sum_{i=1}^{\infty} X_i) = E(\lim_{n\to\infty}(\sum_{i=1}^{n}X_i)) \stackrel{?}{=} \lim_{n\to\infty}E(\sum_{i=1}^{n}X_i)$$
Two special cases in which this is true are:
- $X_i$ are all nonnegative
- $\sum_{i=1}^{\infty}E(|X_i|)$ is finite (i.e., the series of expected absolute values converges)
I have no idea how to prove the relation under these two conditions. What am I missing here?
This has to do with when you can swap an integral with an infinite sum, which is allowed under certain conditions. The main theorems you are missing are the Monotone Convergence Theorem (MCT) and the Dominated Convergence Theorem (DCT):

**Monotone Convergence Theorem.** If $0 \le Y_1 \le Y_2 \le \dots$ and $Y_n \to Y$ almost surely, then $E(Y_n) \to E(Y)$.

**Dominated Convergence Theorem.** If $Y_n \to Y$ almost surely and $|Y_n| \le Z$ for all $n$ with $E(Z) < \infty$, then $E(Y_n) \to E(Y)$.
When you look at the definition of expectation, you see it is an integral, so these convergence theorems apply directly to the partial sums $S_n = \sum_{i=1}^{n} X_i$. Note that by linearity, $E(S_n) = \sum_{i=1}^{n} E(X_i)$ for every finite $n$, so the whole question is whether $E(S_n) \to E(\lim_n S_n)$. As to the special cases you note:

- If the $X_i$ are all nonnegative, the partial sums $S_n$ form a nondecreasing sequence of nonnegative random variables converging to $\sum_{i=1}^{\infty} X_i$, so the MCT gives $E(S_n) \to E\left(\sum_{i=1}^{\infty} X_i\right)$, which is exactly the desired identity.
- If $\sum_{i=1}^{\infty} E(|X_i|) < \infty$, let $Z = \sum_{i=1}^{\infty} |X_i|$. By the nonnegative case, $E(Z) = \sum_{i=1}^{\infty} E(|X_i|) < \infty$, so $Z$ is almost surely finite and the series $\sum_i X_i$ converges almost surely (absolutely). Since $|S_n| \le Z$ for all $n$, the DCT gives $E(S_n) \to E\left(\sum_{i=1}^{\infty} X_i\right)$.
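To see why some hypothesis is really needed, here is a standard counterexample (my own choice, not from your book). Take $U \sim \text{Unif}(0,1)$, set $X_1 = 1$, and for $n \ge 2$ define
$$X_n = n\,\mathbf{1}\left\{U < \tfrac{1}{n}\right\} - (n-1)\,\mathbf{1}\left\{U < \tfrac{1}{n-1}\right\}.$$
The partial sums telescope:
$$S_n = \sum_{i=1}^{n} X_i = n\,\mathbf{1}\left\{U < \tfrac{1}{n}\right\},$$
so $S_n \to 0$ almost surely (for any $U > 0$, eventually $\tfrac{1}{n} < U$), giving $E\left(\sum_{i=1}^{\infty} X_i\right) = 0$. But $E(S_n) = n \cdot \tfrac{1}{n} = 1$ for every $n$, so $\sum_{i=1}^{\infty} E(X_i) = 1 \ne 0$. Neither hypothesis holds here: the $X_n$ take negative values, and a short computation gives $E(|X_n|) = \tfrac{2}{n}$ for $n \ge 2$, so $\sum_n E(|X_n|)$ diverges.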