Law of Large Numbers for Martingales


The following question has me stumped: Let $(X_n)$ be a square-integrable martingale with $E[X_n^2]\leq n$ for all $n$. Prove that $X_n/n$ tends to $0$ almost surely. (This is in a sense a law of large numbers, generalizing the case where $X_n$ is a sum of $n$ i.i.d. zero-mean random variables.)

Any ideas?


Answer 1

We assume $X_0=0$ without loss of generality. Define $$Y_n:=\sum_{i=1}^n\frac{X_i-X_{i-1}}i,\quad n\geqslant 1, \qquad Y_0:=0.$$ Then $\left(Y_n\right)_{n\geqslant 1}$ is a martingale (for the same filtration as $\left(X_n\right)_{n\geqslant 1}$). Since $\left(X_i-X_{i-1}\right)_{i\geqslant 1}$ is a martingale difference sequence, its increments are pairwise orthogonal in $L^2$ and $\mathbb E\left[\left(X_i-X_{i-1}\right)^2\right]=\mathbb E\left[X_i^2\right]-\mathbb E\left[X_{i-1}^2\right]$, hence
\begin{align} \mathbb E\left[Y_n^2\right]=\sum_{i=1}^n\frac 1{i^2}\left(\mathbb E\left[X_i^2\right]-\mathbb E\left[X_{i-1}^2\right]\right). \end{align} Now, using Abel's transformation (summation by parts) and the assumption $\mathbb E\left[X_i^2\right]\leqslant i$, we deduce that the sequence $\left(\mathbb E\left[Y_n^2\right]\right)_{n\geqslant 1}$ is bounded. The martingale $\left(Y_n\right)_{n\geqslant 1}$ is therefore bounded in $L^2$, so by the martingale convergence theorem it converges almost surely to some random variable $Y$.
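The boundedness step can be sanity-checked numerically. The sketch below is an illustration only, not part of the proof: it takes $X_n$ to be a simple $\pm 1$ random walk, for which $E[X_n^2]=n$ exactly, and compares a Monte Carlo estimate of $E[Y_n^2]$ with the formula $\sum_{i=1}^n 1/i^2$, which is bounded by $\pi^2/6$.

```python
import math
import random

def simulate_Y_squared(n, rng):
    """One sample of Y_n^2, where Y_n = sum_{i<=n} (X_i - X_{i-1})/i
    and X is a simple +/-1 random walk (so X_i - X_{i-1} = +/-1)."""
    y = 0.0
    for i in range(1, n + 1):
        y += rng.choice((-1, 1)) / i
    return y * y

rng = random.Random(0)
n, trials = 100, 20000
estimate = sum(simulate_Y_squared(n, rng) for _ in range(trials)) / trials

# For this walk E[X_i^2] = i, so the displayed formula gives
# E[Y_n^2] = sum_{i=1}^n 1/i^2, which is bounded by pi^2/6.
theory = sum(1.0 / i ** 2 for i in range(1, n + 1))
print(estimate, theory, math.pi ** 2 / 6)
```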

Now, accounting for $X_0=0$, we have \begin{align} \frac{X_n}n&=\frac 1n\sum_{l=1}^n\left(X_l-X_{l-1}\right)\\ &=\frac 1n\sum_{l=1}^n\frac{X_l-X_{l-1}}l\cdot l\\ &=\frac 1n\sum_{l=1}^n\left(Y_l-Y_{l-1}\right)\cdot l\\ &=\frac 1n\sum_{k=1}^nY_k\cdot k-\frac 1n\sum_{k=0}^{n-1}Y_k\cdot (k+1)\\ &=\frac 1n\sum_{k=1}^nY_k\cdot k-\frac 1n\sum_{k=1}^{n-1}Y_k\cdot (k+1)\\ &=Y_n-\frac 1n\sum_{k=1}^{n-1}Y_k. \end{align} Since $Y_n\to Y$ almost surely, the Cesàro averages $\frac 1n\sum_{k=1}^{n-1}Y_k$ also converge to $Y$ almost surely, and it follows that $X_n/n\to 0$ almost surely.
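The chain of equalities above is purely algebraic, so it can be checked on an arbitrary sequence with $X_0=0$. The short script below (illustrative only) verifies that $X_n/n$ equals $Y_n - \frac1n\sum_{k=1}^{n-1}Y_k$ up to floating-point error:

```python
import random

rng = random.Random(1)
n = 50

# Any sequence with X_0 = 0 works: the identity is purely algebraic.
X = [0.0]
for _ in range(n):
    X.append(X[-1] + rng.uniform(-1.0, 1.0))

# Y_k = sum_{i<=k} (X_i - X_{i-1}) / i, with Y_0 = 0.
Y = [0.0]
for i in range(1, n + 1):
    Y.append(Y[-1] + (X[i] - X[i - 1]) / i)

lhs = X[n] / n
rhs = Y[n] - sum(Y[1:n]) / n  # Y_n - (1/n) * sum_{k=1}^{n-1} Y_k
print(abs(lhs - rhs))
```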

Answer 2

The claim in question is a corollary of a standard SLLN for martingale difference sequences (MDS).

SLLN for MDS

The statement of the SLLN for MDS is as follows. Let $(N_t)$ be a martingale difference sequence such that $\sum\limits_{t=1}^{\infty} \frac{E[N_t^2]}{t^2} < \infty$. Then

$$ \frac{1}{n} \sum_{t=1}^n N_t \rightarrow 0 \;\;a.s. $$

(In this case, the martingale difference sequence $N_t$ is obtained by differencing the martingale $X_t$: $N_t = X_t - X_{t-1}$, so that $E[N_t^2] = E[X_t^2] - E[X_{t-1}^2]$ by orthogonality of martingale increments. Summation by parts (with $X_0 = 0$) gives \begin{align*} \sum_{t=1}^n \frac{E[N_t^2]}{t^2} &= \sum_{t=1}^n \frac{E[X_t^2] - E[X_{t-1}^2]}{t^2} \\ &= \frac{E[X_n^2]}{n^2} + \sum_{t = 1}^{n-1} E[X_t^2] \left( \frac{1}{t^2} - \frac{1}{(t+1)^2} \right). \end{align*}

The assumption $E[X_t^2] = O(t)$, together with $\frac{1}{t^2} - \frac{1}{(t+1)^2} = O\!\left(\frac{1}{t^3}\right)$, implies $$ E[X_t^2] \left( \frac{1}{t^2} - \frac{1}{(t+1)^2} \right) = O\!\left(\frac{1}{t^2}\right), $$ while $\frac{E[X_n^2]}{n^2} = O\!\left(\frac{1}{n}\right)$. Therefore $\sum\limits_{t=1}^{\infty} \frac{E[N_t^2]}{t^2} < \infty$. )
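As a quick numerical check of this bound (illustrative only; it assumes the extreme case $E[X_t^2] = t$ allowed by the hypothesis, attained e.g. by a $\pm 1$ random walk), the summation-by-parts terms are indeed $O(1/t^2)$ and their partial sums stay bounded:

```python
# Extreme case allowed by the assumption: E[X_t^2] = t (e.g. a +/-1 random walk).
# The summation-by-parts terms are then t * (1/t^2 - 1/(t+1)^2).
N = 100000
terms = [t * (1.0 / t ** 2 - 1.0 / (t + 1) ** 2) for t in range(1, N)]
partial = sum(terms)

# Each term is O(1/t^2): t^2 * term = t(2t+1)/(t+1)^2 < 2 for every t,
# so the partial sums stay bounded (here they approach pi^2/6).
max_scaled = max(t ** 2 * term for t, term in zip(range(1, N), terms))
print(partial, max_scaled)
```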

In turn, the SLLN for MDS can be shown via two arguments. Both are standard devices for results of this type, one via the martingale convergence theorem and another via Kolmogorov's martingale maximal inequality.

Via Martingale Convergence Theorem

(The previous answer is a variation of this argument.)

If $\sum\limits_{t=1}^{\infty} \frac{E[N_t^2]}{t^2} < \infty$, the martingale $Y_n = \sum\limits_{t = 1}^n \frac{N_t}{t}$, $n \geq 1$, is bounded in $L^2$ and therefore converges almost surely (and in $L^2$). By Kronecker's lemma, $$ \frac{1}{n}\sum_{t = 1}^n N_t \stackrel{a.s.}{\rightarrow} 0 $$ as $n \rightarrow \infty$.
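Kronecker's lemma itself is easy to see in action. The snippet below is a standalone illustration with a deterministic sequence (not the martingale from the problem): it takes $a_t = (-1)^t\sqrt t$, for which $\sum a_t/t = \sum (-1)^t/\sqrt t$ converges by the alternating series test, and shows the Cesàro averages $\frac1n\sum_{t\le n} a_t$ shrinking toward $0$ even though $\sum a_t$ diverges.

```python
import math

# Kronecker's lemma in action: take a_t = (-1)^t * sqrt(t). The series
# sum a_t / t = sum (-1)^t / sqrt(t) converges (alternating series test),
# so the lemma predicts (1/n) * sum_{t<=n} a_t -> 0.
def cesaro(n):
    return sum((-1) ** t * math.sqrt(t) for t in range(1, n + 1)) / n

vals = [abs(cesaro(n)) for n in (10, 100, 1000, 10000)]
print(vals)  # decreasing toward 0, roughly like 1/(2*sqrt(n))
```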

Via Maximal Inequality

Consider again the $L^2$-martingale $Y_n = \sum\limits_{t = 1}^n \frac{N_t}{t}$, $n \geq 1$, and let $\sigma^2_t = \frac{E[N_t^2]}{t^2}$ denote the variance of its $t$-th increment.

By the maximal inequality, for all $n > 0$ and all $\epsilon > 0$, $$ P\left( \sup_{m \geq n} | Y_m - Y_n | \geq \epsilon \right) \leq \frac{K}{\epsilon^2} \sum_{t > n} \sigma^2_t $$ for some constant $K$ independent of $n$ (Kolmogorov's inequality gives $K = 1$). Since $\sum_t \sigma^2_t < \infty$, the right-hand side tends to $0$ as $n \to \infty$, so $$ P\left( \inf_n \sup_{m \geq n} | Y_m - Y_n | \geq \epsilon \right) = 0 $$ for all $\epsilon > 0$. In other words, the sequence $Y_n$, $n \geq 1$, is Cauchy, and therefore converges, with probability $1$. Again by Kronecker's lemma, $$ \frac{1}{n}\sum_{t = 1}^n N_t $$ converges to zero as $n \rightarrow \infty$ with probability $1$.
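The maximal-inequality step can also be illustrated numerically. This Monte Carlo sketch assumes $N_t = \pm 1$ with equal probability (so $\sigma_t^2 = 1/t^2$), truncates the supremum at a finite horizon $M$, and compares the empirical tail probability with the bound for $K = 1$:

```python
import random

# Monte Carlo sketch, assuming N_t = +/-1 with equal probability,
# so sigma_t^2 = E[N_t^2]/t^2 = 1/t^2. The supremum is truncated at horizon M.
rng = random.Random(2)
n0, M, eps, trials = 10, 500, 0.5, 5000

def tail_max(rng):
    """max over n0 < m <= M of |Y_m - Y_{n0}| along one simulated path."""
    s, best = 0.0, 0.0
    for t in range(n0 + 1, M + 1):
        s += rng.choice((-1, 1)) / t
        best = max(best, abs(s))
    return best

empirical = sum(tail_max(rng) >= eps for _ in range(trials)) / trials
bound = sum(1.0 / t ** 2 for t in range(n0 + 1, M + 1)) / eps ** 2  # K = 1
print(empirical, bound)
```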