I'm stuck on the following problem.
Let $\xi_1,\xi_2,...$ be positive random variables defined on the same probability space. Suppose $\xi_n \rightarrow \xi$ in probability. If in addition, $\lim\limits_{n\rightarrow\infty}E[\xi_n] = E[\xi]$, then prove that $\xi_n \rightarrow \xi$ in $L^1$.
My approach was the following: Since $|x| = x^+ + x^-$ and $x = x^+ - x^-$,
$$ E[|\xi_n - \xi|] = E[(\xi_n - \xi)^+] + E[(\xi_n - \xi)^-] = E[\xi_n - \xi] + 2E[(\xi_n-\xi)^-]$$ The first term on the right will converge to $0$ by assumption. However, I'm not sure how to deal with the second term, $E[(\xi_n-\xi)^-]$. Would convergence in probability imply that this goes to $0$? For $\epsilon > 0$,
$$ E[(\xi_n-\xi)^-] = \int_{\{|\xi_n-\xi| > \epsilon\}}(\xi_n-\xi)^-\,dP + \int_{\{|\xi_n-\xi| \leq \epsilon\}}(\xi_n-\xi)^-\,dP \leq \int_{\{|\xi_n-\xi| > \epsilon\}}(\xi_n-\xi)^-\,dP + \epsilon$$
I'm tempted to say that the integral converges to $0$ since $P(|\xi_n-\xi|>\epsilon) \rightarrow 0$, but I can't be certain.
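(Not part of the argument, but here is a quick numerical sanity check of the decomposition $E[|\xi_n - \xi|] = E[\xi_n - \xi] + 2E[(\xi_n-\xi)^-]$, using a made-up example: $\xi$ exponential and $\xi_n = U\xi$ with uniform multiplicative noise $U$, so that the difference takes both signs.)

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example: xi ~ Exp(1), xi_n = U * xi with U ~ Uniform(0.9, 1.1),
# so the difference d = xi_n - xi takes both signs.
xi = rng.exponential(size=200_000)
xi_n = rng.uniform(0.9, 1.1, size=xi.size) * xi

d = xi_n - xi
neg_part = np.maximum(-d, 0.0)            # (xi_n - xi)^-

lhs = np.mean(np.abs(d))                  # E|xi_n - xi|
rhs = np.mean(d) + 2 * np.mean(neg_part)  # E[xi_n - xi] + 2 E[(xi_n - xi)^-]

print(lhs, rhs)  # the two agree, since |x| = x + 2x^- holds pointwise
```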
You're basically there. We do have that $$\int_{\{|\xi_n-\xi| > \epsilon\}}(\xi_n-\xi)^-\,dP \to 0$$ as $n \to \infty$. First we have the easy estimate $$\int_{\{|\xi_n-\xi| > \epsilon\}}(\xi_n-\xi)^-\,dP \leq \int_{\{|\xi_n-\xi| > \epsilon\}} \xi\,dP,$$ since $\xi_n \geq 0$ implies $(\xi_n-\xi)^- \leq \xi$. So it suffices to show that the right-hand integral goes to $0$ as $n \to \infty$. Because $P(|\xi_n-\xi| > \epsilon) \to 0$, this follows from the slightly more general result (absolute continuity of the integral): if $X$ is integrable, then for every $\varepsilon > 0$ there is a $\delta > 0$ such that $\int_A |X|\,dP < \varepsilon$ whenever $P(A) < \delta$.
To see this, note that $$\int_A |X|\,dP = \int_{A \cap \{|X| \leq N\}} |X|\,dP + \int_{A \cap \{|X| > N\}} |X|\,dP \leq NP(A) + \int_{\{|X| > N\}} |X|\,dP$$ Since $X$ is integrable, the last integral tends to $0$ as $N \to \infty$ (by dominated convergence), so take $N$ large enough that it is $< \varepsilon/2$. Then for $P(A) < \frac{\varepsilon}{2N}$ the right-hand side is less than $\varepsilon$. Putting everything together: $\limsup_n E[(\xi_n-\xi)^-] \leq \epsilon$ for every $\epsilon > 0$, so $E[(\xi_n-\xi)^-] \to 0$, and combined with $E[\xi_n - \xi] \to 0$ this gives $E[|\xi_n - \xi|] \to 0$.
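For intuition (again a hypothetical example, not part of the proof): take $\xi \sim \mathrm{Exp}(1)$ and $\xi_n = (1 + 1/n)\xi$. Then $\xi_n \to \xi$ almost surely (hence in probability) and $E[\xi_n] = 1 + 1/n \to 1 = E[\xi]$, so the theorem predicts $L^1$ convergence, and a Monte Carlo estimate shows the $L^1$ distance shrinking accordingly.

```python
import numpy as np

rng = np.random.default_rng(0)
xi = rng.exponential(size=100_000)  # xi ~ Exp(1), E[xi] = 1

l1 = {}
for n in (1, 10, 100, 1000):
    xi_n = (1 + 1 / n) * xi              # xi_n -> xi a.s., E[xi_n] = 1 + 1/n
    l1[n] = np.mean(np.abs(xi_n - xi))   # empirical E|xi_n - xi| = E[xi] / n
    print(n, l1[n])
```

Here the $L^1$ distance is exactly $E[\xi]/n$, so the printed values decay like $1/n$.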