If $M$ is a UI martingale, then $M_t \rightarrow M_{\infty}$ in $L^1$


I'm trying to prove the following:

Let $M$ be a uniformly integrable martingale. Then there exists a random variable $M_{\infty}$ such that $M_t \rightarrow M_{\infty}$ in $L^1$.

This is what I have so far:

A UI martingale $M$ is clearly an $L^1$-bounded martingale. Take, for example, $\epsilon = 1$. Then, by the definition of uniform integrability, there exists $K_1$ such that $\sup_{t \geq 0} E|M_t|< 1+K_{1}.$ Hence, by the martingale convergence theorem, there exists $M_{\infty} \in L^1$ such that $M_t \rightarrow M_{\infty}$ a.s. Now, to show $E|M_t-M_{\infty}| \rightarrow 0$ as $t \rightarrow \infty$, I guess I have to use the dominated convergence theorem, but I can't find any bound. If it were $L^2,$ I could use Doob's $L^p$-inequality to find the bound, but we are in $L^1,$ so I don't know how to continue. How can I finish the proof? Is there another way to prove it?


There are 3 solutions below.

BEST ANSWER

From Rogers and Williams (1st Volume).

We will need the following two results:

Proposition 1. Suppose that $X \in L^1$. Let $\epsilon > 0.$ Then there exists $K$ such that $$E[|X|;|X|>K] < \epsilon.$$

Theorem 2. (Bounded Convergence Theorem) Let $(X_n)$ be a sequence of random variables, and let $X$ be a random variable. Suppose that $X_n \rightarrow X$ in probability and that, for some $K \in [0, \infty),$ we have $|X_n(\omega)| \leq K$ for every $n$ and $\omega.$ Then $$ E[ |X_n -X |] \rightarrow 0. $$
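Theorem 2 can be checked numerically. Below is a minimal sketch (my own illustrative example, not from the text): on the probability space $([0,1], \text{Lebesgue})$, approximated by a midpoint grid, the sequence $X_n(\omega) = \omega^{1+1/n}$ is bounded by $K=1$ and converges pointwise (hence in probability) to $X(\omega) = \omega$, and the $L^1$ error $E|X_n - X| = \frac12 - \frac{1}{2+1/n} \sim \frac{1}{4n}$ indeed vanishes.

```python
import numpy as np

# Deterministic sketch of Theorem 2 on ([0,1], Lebesgue), approximated by a
# midpoint grid. All names are illustrative choices, not from the source.
# X_n(w) = w**(1 + 1/n) is bounded by K = 1 and converges pointwise to w.
N = 1_000_000
w = (np.arange(N) + 0.5) / N          # midpoint grid on [0, 1]
X = w

def l1_error(n):
    """E|X_n - X| approximated by the grid average."""
    return np.mean(np.abs(w ** (1 + 1 / n) - X))

# E|X_n - X| = 1/2 - 1/(2 + 1/n) ~ 1/(4n), so the L^1 error vanishes.
errors = [l1_error(n) for n in (1, 10, 100)]
print(errors)
```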

Answer to the question:

A UI martingale $M$ is clearly an $L^1$-bounded martingale. Take, for example, $\epsilon = 1$. Then, by the definition of uniform integrability, there exists $K_1$ such that, for all $t \geq 0,$ $$E|M_t| = E[|M_t|;|M_t|>K_1] + E[|M_t|;|M_t| \leq K_1] \leq 1 + K_1.$$ Hence $\sup_{t \geq 0}E|M_t| \leq 1+ K_1$ and $M$ is an $L^1$-bounded martingale. By the martingale convergence theorem, there exists $M_{\infty} \in L^1$ such that $M_t \rightarrow M_{\infty}$ a.s., which implies that $M_t \rightarrow M_{\infty}$ in probability.

Next, for $K \in [0,\infty),$ define the truncation function $g_K: \mathbb R \rightarrow [-K,K]$ as follows: $$g_K(x):= \begin{cases} K \quad \text{ if } x>K; \\ x \quad \text{ if } |x| \leq K; \\ -K \quad \text{ if } x<-K. \end{cases}$$
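The two properties of $g_K$ that the proof relies on can be sketched numerically (my own illustrative choices, again on $([0,1], \text{Lebesgue})$ via a midpoint grid): $g_K$ is 1-Lipschitz, and the truncation error $E|g_K(X)-X| = E[(|X|-K)^+]$ vanishes as $K \to \infty$ for integrable $X$. For $X(\omega)=\omega^{-1/2}$, which is integrable but unbounded, one has $E[(X-K)^+] = 1/K$ exactly.

```python
import numpy as np

# Illustrative sketch (names are my own): g_K is np.clip, and we check the
# two properties the proof uses. X(w) = w**(-1/2) is integrable but
# unbounded, with E[(X - K)^+] = 1/K exactly.
def g(x, K):
    """Truncation at level K: g_K(x) = max(-K, min(x, K))."""
    return np.clip(x, -K, K)

N = 1_000_000
w = (np.arange(N) + 0.5) / N          # midpoint grid on [0, 1]
X = w ** -0.5

# 1-Lipschitz: |g_K(y) - g_K(x)| <= |y - x| for every pair of values.
y = np.sin(7 * w) * 3
assert np.all(np.abs(g(y, 2.0) - g(X, 2.0)) <= np.abs(y - X) + 1e-12)

# Truncation error E|g_K(X) - X| shrinks like 1/K.
for K in (10.0, 100.0):
    print(K, np.mean(np.abs(g(X, K) - X)))
```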

Now, using the family of functions $g_K,$ we will prove that $M_t \rightarrow M_\infty$ in $L^1$.

Let $\epsilon > 0$ and choose $K$ large enough so that, for all $t \geq 0,$ \begin{align*} E|g_K(M_t)-M_t| &< \frac{\epsilon}{3} \tag*{(since M is a UI-martingale)} \\ E|g_K(M_\infty)-M_\infty| &< \frac{\epsilon}{3} \tag*{(by Proposition 1)} \end{align*}

Moreover, note that the functions $g_K$ satisfy $|g_K(y)-g_K(x)| \leq |y-x|$ for all $x,y \in \mathbb R.$ Hence, with $K$ from the previous step, we have for all $t \geq 0$ $$|g_K(M_\infty)-g_K(M_t)| \leq |M_\infty-M_t|,$$ which implies that $$g_K(M_t) \rightarrow g_K(M_\infty) \text{ a.s. }$$ and also $g_K(M_t) \rightarrow g_K(M_\infty)$ in probability. Hence, by Theorem 2, for large enough $t$ we have $E|g_K(M_\infty)-g_K(M_t)|< \frac{\epsilon}{3}.$ Therefore, by the triangle inequality, \begin{align*} E|M_\infty - M_t| &= E|M_t - g_K(M_t) + g_K(M_t) - g_K(M_\infty) + g_K(M_\infty) - M_\infty| \\ &\leq E|M_t - g_K(M_t)| + E|g_K(M_t) - g_K(M_\infty)| + E|g_K(M_\infty) - M_\infty| \\ &< \epsilon. \end{align*}

ANSWER 2

Recall that if $X_n\to X$ in probability, then there exists a subsequence $\{n_k\}$ such that $X_{n_k}\to X$ a.s. For each positive integer $k$, we have $\lim_{n\to\infty} \mathbb P(|X_n-X|>2^{-k})=0$. So for each $k$, we may find $n_k$ (with $n_k > n_{k-1}$) such that $\mathbb P(|X_{n_k}-X| > 2^{-k})\leqslant 2^{-k}$, and consequently $$ \sum_{k=1}^\infty \mathbb P(|X_{n_k}-X|>2^{-k})\leqslant \sum_{k=1}^\infty 2^{-k}<\infty. $$ Then by the Borel-Cantelli lemma, $$ \mathbb P\left(\limsup_{k\to\infty}\left\{|X_{n_k}-X|>2^{-k}\right\} \right) = 0, $$ from which it follows that $X_{n_k}\to X$ a.s.
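The subsequence extraction is worth seeing on the classic "typewriter" sequence (my own illustrative example, not from the answer): on $([0,1], \text{Lebesgue})$, let $X_n$ be the indicator of a sliding dyadic interval. Then $X_n \to 0$ in probability but not a.s., while along $n_k = 2^k$ the supporting intervals shrink to $[0, 2^{-k}]$, the probabilities become summable, and Borel-Cantelli gives a.s. convergence.

```python
from fractions import Fraction

# Typewriter sequence on ([0,1], Lebesgue): X_n is the indicator of the
# dyadic interval [j/2^m, (j+1)/2^m] with n = 2^m + j. Names are my own.
def interval(n):
    """Support of X_n as a pair of exact rational endpoints."""
    m = n.bit_length() - 1
    j = n - 2 ** m
    return Fraction(j, 2 ** m), Fraction(j + 1, 2 ** m)

def prob(n):
    """P(X_n = 1) = length of the supporting interval = 2^-m."""
    a, b = interval(n)
    return b - a

# Convergence in probability: P(|X_n - 0| > 1/2) = 2^-m -> 0.
print([prob(n) for n in (1, 2, 3, 4, 7, 8)])

# Along the subsequence n_k = 2^k the probabilities are summable,
# so Borel-Cantelli gives X_{n_k} -> 0 a.s.
tail_sum = sum(prob(2 ** k) for k in range(1, 30))
print(tail_sum)   # partial sums of 2^-k, bounded by 1
```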

Since $X_{n_k}\to X$ a.s. we have by Fatou's lemma $$ \mathbb E[|X|] = \mathbb E\left[\liminf_{k\to\infty}|X_{n_k}|\right]\leqslant \liminf_{k\to\infty} \mathbb E[|X_{n_k}|]. $$

A sequence of random variables $\{X_n\}$ is said to be uniformly integrable if $$\lim_{K\to\infty}\sup_n \mathbb E[|X_n|\mathsf 1_{\{|X_n|>K\}}] = 0.$$ This implies that $\sup_n\mathbb E[|X_n|]<\infty$. Now, we show that for every $\varepsilon>0$, there exists $\delta>0$ such that for any event $E$, $$ \mathbb P(E)<\delta\implies \sup_n\mathbb E[|X_n|\mathsf 1_E]<\varepsilon.\tag1 $$ For $K>0$, write $E_n = \{|X_n|>K\}$. Then $$ \mathbb E[|X_n|\mathsf 1_E] = \mathbb E[|X_n|(\mathsf 1_{E\cap E_n}+\mathsf 1_{E\setminus E_n})] \leqslant \mathbb E[|X_n|\mathsf 1_{E_n}] + K\mathbb P(E). $$ Given $\varepsilon>0$, there exists $K>0$ such that $\sup_n\mathbb E[|X_n|\mathsf 1_{E_n}]<\frac\varepsilon2$. Setting $\delta=\frac\varepsilon{2K}$, we see that $(1)$ holds.
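The splitting bound behind $(1)$ can be sanity-checked numerically; the following is a minimal sketch on a midpoint grid for $([0,1], \text{Lebesgue})$, with the random variable and the event being my own illustrative choices.

```python
import numpy as np

# Check of the splitting bound E[|X| 1_E] <= E[|X| 1_{|X|>K}] + K * P(E)
# on a midpoint grid; X and E are illustrative choices, not from the text.
N = 1_000_000
w = (np.arange(N) + 0.5) / N
X = np.abs(np.log(w))                 # integrable but unbounded near 0

E = (w > 0.3) & (w < 0.3004)          # a small event, P(E) = 0.0004
K = 50.0

lhs = np.mean(X * E)                  # E[|X| 1_E]
rhs = np.mean(X * (X > K)) + K * np.mean(E)
print(lhs, rhs)                       # the bound holds with room to spare
```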

Now from $\mathbb E[|X|]\leqslant \liminf_{k\to\infty}\mathbb E[|X_{n_k}|]$ and $\sup_n\mathbb E[|X_n|]<\infty$, we have that $\mathbb E[|X|]<\infty$, i.e. $X\in L^1$. The inequality $$ |X_n-X|^r \leqslant 2^r (|X_n|^r +|X|^r),\quad r>0 $$ shows that the sequence $\{|X_n-X|\}$ is uniformly integrable (check this!), and so for each $\varepsilon>0$, \begin{align} \mathbb E[|X_n-X|] & = \mathbb E[|X_n-X|\mathsf 1_{\{|X_n-X|>\varepsilon\}}] + \mathbb E[|X_n-X|\mathsf 1_{\{|X_n-X|\leqslant\varepsilon\}}]\\ &\leqslant \mathbb E[|X_n-X|\mathsf 1_{\{|X_n-X|>\varepsilon\}}] + \varepsilon. \end{align} Since $\{|X_n-X|\}$ is uniformly integrable and $\mathbb P(|X_n-X|>\varepsilon)\to 0$, applying $(1)$ to this family gives $$ \lim_{n\to\infty} \mathbb E[|X_n-X|\mathsf 1_{\{|X_n-X|>\varepsilon\}}] = 0, $$ from which the result follows.

To answer @UBM's question: taking $r=1$ in the inequality above gives $$\sup_n \mathbb E[|X_n-X|] \leqslant 2\left( \sup_n\mathbb E[|X_n|] + \mathbb E[|X|]\right)<\infty,$$ so the family $\{|X_n-X|\}$ is bounded in $L^1$. Moreover, for any event $E$, $$\mathbb E[|X_n-X|\mathsf 1_E] \leqslant \mathbb E[|X_n|\mathsf 1_E] + \mathbb E[|X|\mathsf 1_E].$$ Given $\varepsilon>0$, by $(1)$ there exists $\delta_1>0$ such that $\mathbb P(E)<\delta_1$ implies $\sup_n\mathbb E[|X_n|\mathsf 1_E]<\frac\varepsilon2$, and since $X\in L^1$, Proposition 1 gives $\delta_2>0$ such that $\mathbb P(E)<\delta_2$ implies $\mathbb E[|X|\mathsf 1_E]<\frac\varepsilon2$. Taking $\delta=\min(\delta_1,\delta_2)$ shows that $\{|X_n-X|\}$ satisfies $(1)$; combined with the $L^1$-bound above, this means $\{|X_n-X|\}$ is uniformly integrable.

ANSWER 3

Truncate, using UI, to be able to use DCT. In more detail, given $\epsilon>0$ use the fact that $(M_n-M_\infty)$ is UI (why?) to choose $K$ so large that $E[|M_n-M_\infty|; |M_n-M_\infty|>K]<\epsilon$. By DCT and pointwise convergence, $\lim_nE[|M_n-M_\infty|; |M_n-M_\infty|\le K]=0$. Therefore $\limsup_nE|M_n-M_\infty|\le\epsilon$.
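The statement being proved can itself be sketched numerically. Below is a minimal illustration (my own construction, not from any of the answers): on $([0,1], \text{Lebesgue})$ take $M_\infty(\omega) = \omega^2$ and let $M_n = E[M_\infty \mid \mathcal F_n]$, where $\mathcal F_n$ is generated by the dyadic partition into $2^n$ intervals. This martingale is bounded, hence UI, and $E|M_n - M_\infty| \to 0$.

```python
import numpy as np

# Dyadic-martingale sketch of the theorem (illustrative names, my own):
# M_n = E[M_inf | F_n] is the block-average of M_inf over 2^n dyadic
# intervals, approximated on a midpoint grid.
N = 2 ** 14
w = (np.arange(N) + 0.5) / N
M_inf = w ** 2

def M(n):
    """Conditional expectation of M_inf given the dyadic partition of size 2^n."""
    blocks = M_inf.reshape(2 ** n, -1)            # one row per dyadic interval
    return np.repeat(blocks.mean(axis=1), N // 2 ** n)

errors = [np.mean(np.abs(M(n) - M_inf)) for n in (2, 5, 8)]
print(errors)                                     # decreasing toward 0
```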