What extra condition is needed so that $L^2$ convergence follows from almost sure convergence?


I was wondering: if I have a sequence of random variables $X_N$ with $$ X_N \to X \text{ a.s.} $$ (and the first few moments of the $X_N$ are uniformly bounded), what would I need to be able to get $$ E[(X_N-X)^2] \to 0\,? $$ One intuition I had was something like $$ X \in L^{2+\delta} $$ for some $\delta>0$, but is this true? If so, how can I prove it?


Best answer

The explicit condition is uniform integrability. Uniform integrability generalizes the domination hypothesis of the DCT and lets you conclude $L^{1}$ convergence from convergence in probability (and thus also from almost sure convergence). In fact, the DCT condition of being bounded by an integrable random variable is a sufficient condition for uniform integrability.

A collection $\{X_{i}\}_{i\in I}$ of random variables is said to be uniformly integrable if for each $\epsilon>0$ there exists an $M>0$ (depending on $\epsilon$ only) such that $\Bbb{E}(|X_{i}|\mathbf{1}_{\{|X_{i}|>M\}})<\epsilon$ for all $i\in I$, or equivalently $\sup_{i\in I}\Bbb{E}(|X_{i}|\mathbf{1}_{\{|X_{i}|>M\}})\le\epsilon$. Here $I$ is an arbitrary indexing set.
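To see why some condition beyond almost sure convergence is needed, consider the classic non-uniformly-integrable family $X_n = n\,\mathbf{1}_{\{U\le 1/n\}}$ with $U\sim\mathrm{Uniform}(0,1)$: it converges to $0$ a.s., yet $\Bbb{E}(X_n)=1$ for every $n$. A small script (a sketch; the tail expectations are computed from the closed form, no simulation) illustrating the failure of the definition above:

```python
# Classic failure of uniform integrability: X_n = n * 1_{U <= 1/n}, U ~ Uniform(0,1).
# X_n -> 0 almost surely, yet E[X_n] = n * P(U <= 1/n) = 1 for every n.

def tail_expectation(n: int, M: float) -> float:
    """E[X_n * 1_{X_n > M}], computed exactly: X_n equals n with probability 1/n."""
    return n * (1.0 / n) if n > M else 0.0

# For every cutoff M, sup_n E[X_n 1_{X_n > M}] stays at 1: the family is not UI,
# and indeed E[X_n] = 1 does not converge to E[0] = 0.
for M in [10, 100, 1000]:
    sup_tail = max(tail_expectation(n, M) for n in range(1, 10 * M + 1))
    print(M, sup_tail)
```

No single cutoff $M$ makes the tail expectations small simultaneously in $n$, which is exactly what the definition forbids.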

I'll state some equivalent conditions and related results below:

The following is known as the Uniform Integrability Theorem; it allows you to conclude $L^{1}$ convergence from convergence in probability:

$X_{n}\xrightarrow{P} X$ and $\{X_{n}\}_{n}$ is uniformly integrable if and only if $X_{n}\xrightarrow{L^{1}}X$.

For a proof, see Theorem $13$ here; if you only want the statement, Wikipedia also has it here.

(Note that you can even weaken the condition $X_{n}\xrightarrow{P} X$ to $X_{n}\xrightarrow{d} X$, thanks to the Skorokhod Representation Theorem.)

You also always have the following:

If $X_{n}\xrightarrow{a.s.} X$, then $X_{n}\xrightarrow{L^{p}} X$ for $p\geq 1$ if and only if $\Bbb{E}(|X_{n}|^{p})\to \Bbb{E}(|X|^{p})$.

This is a general result about $L^{p}$ spaces, though not really useful in your case. See Theorem $7$, Chapter $7$ of Royden's *Real Analysis* for a proof (it is simple enough, using the fact that $x\mapsto |x|^{p}$ is convex).

So the most general statement you require is:

$X_{n}\xrightarrow{L^{2}} X$ if and only if $X_{n}\xrightarrow{P} X$ and $\{X_{n}^{2}\}$ is uniformly integrable.

One sufficient condition for uniform integrability is that $\{X_{n}\}$ is uniformly bounded in $L^{1+\delta}$, so you were on the right track. (For $L^{2}$ convergence the appropriate condition is uniform boundedness in $L^{2+\delta}$.)

What I mean is

$\{X_{n}^{2}\}$ is uniformly integrable if $\{X_{n}^{2}\}$ is uniformly bounded in $L^{1+\delta}$, or equivalently if $\{|X_{n}|\}$ is uniformly bounded in $L^{2+2\delta}$ (since $\delta>0$ is arbitrary, this is the same as uniform boundedness in $L^{2+\delta}$).

The proof is very simple. Let $\{X_{n}\}$ be a collection of random variables such that $\sup_{n\geq 1}\Bbb{E}(|X_{n}|^{1+\delta})\leq C$. We show that $\{X_{n}\}$ is uniformly integrable.

We have,

$$\Bbb{E}(|X_{n}|\mathbf{1}_{|X_{n}|>M})\leq \Bbb{E}(|X_{n}|\cdot\bigg(\frac{|X_{n}|}{M}\bigg)^{\delta}\mathbf{1}_{|X_{n}|>M})\leq\frac{1}{M^{\delta}}\Bbb{E}(|X_{n}|^{1+\delta})\leq \frac{C}{M^{\delta}} $$ and that's it.
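As a sanity check of the displayed inequality, here is a small script (a sketch, with an assumed Pareto law of index $\alpha=2$ and $\delta=1/2$; both sides of the inequality have closed forms, so no simulation is needed):

```python
# Sanity check of E[|X| 1_{|X|>M}] <= E[|X|^{1+delta}] / M^delta for a concrete
# heavy-tailed law: Pareto with index alpha = 2 (pdf 2 x^{-3} on x >= 1), delta = 1/2.

DELTA = 0.5

def tail_mean(M: float) -> float:
    """E[X 1_{X>M}] = integral_M^infty x * 2 x^{-3} dx = 2 / M  (for M >= 1)."""
    return 2.0 / M

# E[X^{3/2}] = integral_1^infty x^{3/2} * 2 x^{-3} dx = 2 * integral_1^infty x^{-3/2} dx = 4
MOMENT_1_PLUS_DELTA = 4.0

for M in [1.0, 10.0, 100.0]:
    lhs = tail_mean(M)
    rhs = MOMENT_1_PLUS_DELTA / M ** DELTA
    assert lhs <= rhs
    print(M, lhs, rhs)
```

The left side decays like $1/M$ while the bound only guarantees $1/M^{\delta}$, consistent with the inequality being a (possibly loose) upper estimate.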

You could also have directly used $\Bbb{E}(X_{n}^{2}\mathbf{1}_{|X_{n}|>M})\leq \frac{1}{M^{\delta}}\mathbb{E}(|X_{n}|^{2+\delta})$ to conclude that $\{X_{n}^{2}\}$ is uniformly integrable if $\{|X_{n}|\}$ is uniformly bounded in $L^{2+\delta}$.

To sum it up and answer your question:

1.

If $X_{n}\xrightarrow{a.s.} X$ and $\sup_{n\geq 1}\Bbb{E}(|X_{n}|^{2+\delta})<\infty$, then $\{X_{n}^{2}\}$ is uniformly integrable and hence $X_{n}\xrightarrow{L^{2}} X$, i.e. $\Bbb{E}(|X_{n}-X|^{2})\to 0$.

More generally,

2.

If $X_{n}\xrightarrow{P} X$ and $\{|X_{n}|^{p}\}$ is uniformly integrable, then $\Bbb{E}(|X_{n}-X|^{p})\to 0$.
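For statement 1, here is a quick Monte Carlo sanity check (a sketch, assuming $X_n$ is the sample mean of $n$ i.i.d. Uniform(0,1) draws, which converges a.s. to $1/2$ by the strong law and, being bounded by $1$, is uniformly bounded in every $L^{2+\delta}$):

```python
import random

# Monte Carlo sketch of statement 1: X_n = mean of n i.i.d. Uniform(0,1) draws.
# X_n -> 1/2 a.s. (strong law), and |X_n| <= 1, so sup_n E[|X_n|^{2+delta}] <= 1.
# The theorem then gives E[(X_n - 1/2)^2] -> 0; theory says this equals 1/(12 n).

random.seed(0)

def l2_error(n: int, trials: int = 20_000) -> float:
    """Monte Carlo estimate of E[(X_n - 1/2)^2]."""
    total = 0.0
    for _ in range(trials):
        x_n = sum(random.random() for _ in range(n)) / n
        total += (x_n - 0.5) ** 2
    return total / trials

errs = [l2_error(n) for n in (1, 10, 100)]
print(errs)  # roughly 1/12, 1/120, 1/1200
```

The estimated $L^{2}$ errors shrink like $1/n$, matching the exact variance $1/(12n)$ of the sample mean.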

Addendum: The page I linked is an excellent resource on uniform integrability; I often suggest it and use it for reference. All the proofs except the last theorem on the page are very concise. The last theorem requires a bit more subtlety, and I have asked the owner of the site to amend the proof.

Another answer

The uniform boundedness of the moments of order greater than $2$ suffices. The proof below shows the reasoning leading to the result, where the role of $(X_n-X)^2$ is played by $f_n$: uniform boundedness of $\Bbb{E}(|X_n|^{2+\delta'})$ (and hence, by Fatou, of $\Bbb{E}(|X|^{2+\delta'})$) gives $\int|f_n|^{1+\delta}\,dx\le C$ with $\delta=\delta'/2$, and after rescaling one may take $C=1$.

Suppose $f_n\in L^{1+\delta}(0,1)$ satisfy $$\int\limits_0^1|f_n(x)|^{1+\delta}\,dx\le 1,\quad f_n\to 0\ {\rm a.e.}$$ Then $$\int\limits_0^1|f_n(x)|\,dx\underset{n\to\infty}{\longrightarrow} 0.$$

Indeed, for fixed $\varepsilon>0$ and $k\ge \varepsilon^{-1/\delta}$ let $$A_{n,k}=\{x\in [0,1]\,:\, |f_n(x)|\ge k\},\quad B_{n,k}=[0,1]\setminus A_{n,k}.$$ Then $$\int\limits_{B_{n,k}}|f_n(x)|\,dx \underset{n\to\infty}{\longrightarrow} 0$$ by the Lebesgue dominated convergence theorem (on $B_{n,k}$ the integrand is bounded by the constant $k$). Therefore there exists $n_0$ such that $$\int\limits_{B_{n,k}}|f_n(x)|\,dx\le \varepsilon,\quad n\ge n_0.$$

Furthermore, since $k^{\delta}\le |f_n(x)|^{\delta}$ on $A_{n,k}$, for $n\ge n_0$ $$\int\limits_0^1|f_n(x)|\,dx =\int\limits_{A_{n,k}}|f_n(x)|\,dx+\int\limits_{B_{n,k}}|f_n(x)|\,dx\\ \le \varepsilon+\int\limits_{A_{n,k}}|f_n(x)|\,dx= \varepsilon+{1\over k^{\delta}}\int\limits_{A_{n,k}}k^\delta|f_n(x)|\,dx\\ \le \varepsilon+{1\over k^\delta}\int\limits_{A_{n,k}}|f_n(x)|^{1+\delta}\,dx\le \varepsilon+{1\over k^\delta }\le 2\varepsilon.$$
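A hedged numerical illustration of this lemma, assuming $\delta=1$ and the hypothetical family $f_n = \sqrt{n}\,\mathbf{1}_{[0,1/n]}$ (both integrals have closed forms):

```python
# Numerical illustration of the lemma with delta = 1 and the hypothetical family
# f_n = sqrt(n) on [0, 1/n], 0 elsewhere: then f_n -> 0 a.e.,
# the integral of f_n^2 is exactly 1, and the integral of f_n is 1/sqrt(n) -> 0.

def int_f(n: int) -> float:
    """integral_0^1 f_n dx = sqrt(n) * (1/n) = n^{-1/2}."""
    return n ** 0.5 / n

def int_f_squared(n: int) -> float:
    """integral_0^1 f_n^2 dx = n * (1/n) = 1 for every n."""
    return (n ** 0.5) ** 2 / n

for n in [1, 100, 10000]:
    print(n, int_f_squared(n), int_f(n))
```

The $L^{2}$ bound holds with equality for every $n$, yet the $L^{1}$ norms still vanish, exactly as the lemma predicts.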