Simple Question about Monotone Convergence Theorem


Suppose we have a sequence of (discrete) random variables $X_0, X_1, \dotsc$ taking values in $E$, and let $A \subseteq E$. Let $Y$ be some other random variable. Moreover, let $Z$ be a random variable with values in $\{0,1,\dotsc\} \cup \{\infty\}$.

Is it always true that

$$\mathbb{E}[\sum_{n \geq 0} 1(X_n \in A)]= \sum_{n \geq 0 } \mathbb{P}[X_n \in A] \quad (1)$$

for example using the Monotone Convergence Theorem?

Is it always true that

$$\mathbb{E}[Y \sum_{n \geq 0} 1(X_n \in A)]= \sum_{n \geq 0 } \mathbb{E}[Y \cdot 1(X_n \in A)]\quad (2)$$ for example using the Monotone Convergence Theorem, or do we need boundedness of $Y$?

Is it always true that $$\mathbb{E}[Y \cdot 1(Z < \infty)]=\sum_{n=0}^{\infty} \mathbb{E}[Y \cdot 1(Z=n)] \quad (3)$$ or is this only valid using Dominated Convergence, when $Y$ is bounded?

For non-negative random variables $U_i \geq 0$, $i=0,1,\dotsc$, is it always true that $$\mathbb{E}[\sum_{i=0}^{\infty} U_i] = \sum_{i=0}^{\infty} \mathbb{E}[U_i] \quad (4)$$

by the monotone convergence theorem? Thank you very much for your help.

Best Answer

I think you want $X_n$ in your sums instead of $X_0$. Anyway:

Case 1: Yes, monotone convergence is sufficient to prove it.
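As a quick numerical sanity check (my own sketch, not part of the original answer), one can model the events $\{X_n \in A\}$ as independent Bernoulli indicators with $\mathbb{P}[X_n \in A] = 2^{-(n+1)}$, truncated at some finite $N$; both sides of (1) should then be close to $\sum_{n} 2^{-(n+1)} \approx 1$:

```python
import numpy as np

# Monte Carlo sketch of identity (1): E[sum_n 1(X_n in A)] = sum_n P[X_n in A].
# Assumption (not from the original post): the events {X_n in A} are modeled as
# independent Bernoulli(p_n) indicators with p_n = 2^-(n+1), truncated at N terms.
rng = np.random.default_rng(0)
N, trials = 30, 100_000
p = 0.5 ** np.arange(1, N + 1)            # p_n = 2^-(n+1) for n = 0, ..., N-1

# Left side: estimate E[sum_n 1(X_n in A)] by averaging the random sums.
indicators = rng.random((trials, N)) < p  # one sample path per row
lhs = indicators.sum(axis=1).mean()

# Right side: sum_n P[X_n in A], computed exactly (= 1 - 2^-N).
rhs = p.sum()

print(lhs, rhs)
```

The two printed values should agree up to Monte Carlo error of order $1/\sqrt{\text{trials}}$.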

Case 2: If $Y$ takes on one sign, or more generally is bounded in one direction or another, then monotone convergence will give you this. In general this could fail. The most general but still practical way to check this that I can think of would be to check whether $Y$ has finite variance and the sum converges in mean square. (In particular the sum converges in mean square if it is bounded a.s., because of monotonicity.) If both of those hold then you get what you want.
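To spell out the mean-square criterion (a sketch of the step left implicit above): write $S_N = \sum_{n=0}^N 1(X_n \in A)$ and $S = \sum_{n \geq 0} 1(X_n \in A)$. If $\mathbb{E}[Y^2] < \infty$ and $S_N \to S$ in mean square, then by Cauchy–Schwarz,

$$\Big| \mathbb{E}[Y S] - \sum_{n=0}^{N} \mathbb{E}[Y \cdot 1(X_n \in A)] \Big| = \big| \mathbb{E}[Y (S - S_N)] \big| \leq \sqrt{\mathbb{E}[Y^2]} \, \sqrt{\mathbb{E}[(S - S_N)^2]} \longrightarrow 0$$

as $N \to \infty$, which is exactly (2).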

Case 3: Here the situation is actually a bit easier than in case 2, because the events $\{Z = n\}$ are disjoint: the partial sums $\sum_{n=0}^N 1(Z = n) = 1(Z \leq N)$ are bounded by $1$, so $|Y \cdot 1(Z \leq N)| \leq |Y|$ and dominated convergence gives (3) whenever $Y$ is integrable; boundedness of $Y$ is not needed.
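As a numerical illustration of (3) (my own example, not from the post): take $Z$ geometric on $\{0,1,\dotsc\}$, so $Z < \infty$ a.s., and $Y = (-1)^Z$, which is bounded but changes sign:

```python
import numpy as np

# Monte Carlo sketch of identity (3): E[Y 1(Z < inf)] = sum_n E[Y 1(Z = n)].
# Assumptions (my own example): P[Z = n] = (1 - q) q^n on {0, 1, ...}, Y = (-1)^Z.
rng = np.random.default_rng(1)
q, trials = 0.6, 200_000

z = rng.geometric(1 - q, size=trials) - 1   # numpy's geometric is supported on {1,2,...}
y = (-1.0) ** z

# Left side: Z is finite a.s., so 1(Z < inf) = 1 and E[Y 1(Z < inf)] = E[Y].
lhs = y.mean()

# Right side: sum_n E[Y 1(Z = n)] = sum_n (-1)^n (1 - q) q^n, truncated;
# the alternating geometric series sums to (1 - q) / (1 + q) = 0.25.
n = np.arange(200)
rhs = ((-1.0) ** n * (1 - q) * q ** n).sum()

print(lhs, rhs)
```

Both values should be close to $(1-q)/(1+q) = 0.25$, up to Monte Carlo error.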