$L^1$ convergence from convergence of the means


I want to prove the following result, but I can't get past a specific step and I'm not sure if I also need uniform integrability.

Suppose $X_n, X$ are nonnegative random variables with $E(X_n) \rightarrow E(X) < \infty$ and $P(X - X_n > \epsilon) \rightarrow 0$ for every $\epsilon > 0$. Then $E(|X_n - X|) \rightarrow 0$.


My attempt:

Note that for large $n$, $E(X_n) < \infty$ (since $E(X_n) \rightarrow E(X) < \infty$), so $X_n \in L^1$ and the expression below is well defined.

$$E(|X - X_n|) = E((X - X_n)^+) - \underbrace{(E(X) - E(X_n))}_{\rightarrow 0},$$ so it suffices to show that $E((X - X_n)^+) \rightarrow 0$. The second assumption in the question implies $(X - X_n)^+ \rightarrow 0$ in probability, so if we had uniform integrability of the random variables $(X - X_n)^+$ we would be done. I can't proceed from here, so I'd appreciate any help.

On BEST ANSWER

You did most of the work. As you noted (albeit with a missing factor of 2), we have:

$$E(|X - X_n|) = 2 E((X - X_n)^+) - \underbrace{(E(X) - E(X_n))}_{\rightarrow 0}$$
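The factor of 2 comes from the pointwise decomposition into positive and negative parts, $a^- = a^+ - a$:

$$|X - X_n| = (X - X_n)^+ + (X - X_n)^- = 2(X - X_n)^+ - (X - X_n),$$

and taking expectations (valid for large $n$, where all terms are integrable) gives the display above.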

so now we just need to show that $E((X - X_n)^+) \rightarrow 0$. Since $X_n \geq 0$, we have $\{X - X_n > t\} \subseteq \{X > t\}$, and hence $P(X - X_n > t) \leq P(X > t)$ for all $t > 0$. Both $(X - X_n)^+$ and $X$ are nonnegative and integrable (the first is dominated pointwise by the second), so we can use the tail sum formula:
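For reference, the tail sum formula for a nonnegative random variable $Y$ follows from Tonelli's theorem, which justifies swapping the expectation and the integral:

$$E(Y) = E\left(\int_0^\infty \mathbf{1}\{Y > t\}\, dt\right) = \int_0^\infty P(Y > t)\, dt.$$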

$$\lim_{n \rightarrow \infty} E((X - X_n)^+) = \lim_{n \rightarrow \infty} \int_0^\infty P(X - X_n > t)\, dt = \int_0^\infty \lim_{n \rightarrow \infty} P(X - X_n > t)\, dt = 0$$

where we use the dominated convergence theorem for the second equality, with $P(X > t)$ as the dominating function (integrable over $t > 0$ by the tail sum formula applied to $X$, since $E(X) < \infty$), and where $P((X - X_n)^+ > t) = P(X - X_n > t)$ for $t > 0$.
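As a numerical sanity check (an illustration only, not part of the proof), here is a short Python sketch with the hypothetical choice $X \sim \mathrm{Exp}(1)$ and $X_n = X(1 - 1/n)$, which satisfies both hypotheses: $E(X_n) \rightarrow E(X) = 1$ and $P(X - X_n > \epsilon) = P(X > n\epsilon) \rightarrow 0$.

```python
import random

# Monte Carlo estimate of E|X - X_n| for X ~ Exp(1), X_n = X * (1 - 1/n).
# Here |X - X_n| = X / n, so the L^1 gap should shrink like 1/n.
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(200_000)]

l1_gaps = []
for n in (10, 100, 1000):
    gap = sum(abs(x - x * (1 - 1 / n)) for x in samples) / len(samples)
    l1_gaps.append(gap)

print(l1_gaps)  # decreasing toward 0, roughly E(X)/n = 1/n for each n
```

Any other example satisfying the hypotheses (e.g. $X_n = \max(X - 1/n,\, 0)$) would work equally well here.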