Rigorous definition of convergence in mean


I'm looking at the Wikipedia definition of convergence in mean, and I would appreciate clarification on one part.

It says: given a real number $r \ge 1$, we say that the sequence $X_n$ converges in the $r$-th mean (or in the $L^r$-norm) towards the random variable $X$ if the $r$-th absolute moments $E(|X_n|^r)$ and $E(|X|^r)$ of $X_n$ and $X$ exist, and $\lim_{n \to \infty} E[|X_n-X|^r]=0$.

Question: When the expected value of a random variable exists (in this context) is it safe to assume that it's finite? In other words, is convergence in mean still well defined even if the expected value of the limiting random variable is infinite?

Side note: the reason I'm asking is that I'm trying to prove that if a martingale converges in $L^1$ then it is uniformly integrable; however, I'm unsure whether convergence in $L^1$ implies $\lim_{n \to \infty} E[|X_n|] = E[|X|]$, where $\underline{both}$ $E[|X_n|] < \infty$ and $E[|X|] < \infty$.


On BEST ANSWER

When you say $X_n \to X$ in $L^{r}$ you assume that $E|X_n|^{r} < \infty$ for all $n$ and that $E|X|^{r} < \infty$. So yes, "exists" here means "is finite".
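In fact, finiteness of $E|X|^{r}$ need not be assumed separately: it follows from the finiteness of the $E|X_n|^{r}$ together with $E|X_n - X|^{r} \to 0$. A sketch via Minkowski's inequality, writing $\|Y\|_r = (E|Y|^r)^{1/r}$: pick $n$ large enough that $E|X_n - X|^{r} \le 1$; then

$$\|X\|_r \le \|X - X_n\|_r + \|X_n\|_r \le 1 + \|X_n\|_r < \infty.$$

So convergence in the $r$-th mean towards a random variable with an infinite $r$-th absolute moment is impossible.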

If $(X_n)$ is a martingale then $E|X_n| <\infty$ for every $n$, by definition. When you say that the martingale converges in $L^{1}$, the limit $X$ does have a finite first absolute moment: $E|X| <\infty$.
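Regarding the side note: the convergence of the first absolute moments that you want follows from the (reverse) triangle inequality for expectations,

$$\bigl|E|X_n| - E|X|\bigr| \;\le\; E\bigl||X_n| - |X|\bigr| \;\le\; E|X_n - X| \;\longrightarrow\; 0,$$

so $L^1$ convergence does imply $E|X_n| \to E|X|$, with both sides finite as noted above.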