Convergence of quadratic variations


This may be a fundamental question in martingale theory.

Let $n \in \mathbb{N}$ and let $M^n=(M^{n,1},\ldots,M^{n,d})$ be a $d$-dimensional square-integrable martingale on a probability space with probability measure $P_n$. The martingales $M^n$ need not be continuous.

We denote by $E_n$ the expectation under $P_n$. We assume that for any $t \in [0,1]$ \begin{align*} \lim_{n \to \infty}E_n\left[\left|[M^{n,i},M^{n,j}]_t - \delta_{i,j}t\right|\right]=0. \end{align*} Then, can we show that the law of $M^n$ converges weakly to that of a $d$-dimensional Brownian motion in $D([0,1])$? Here, $D([0,1])$ is the space of right-continuous functions with left limits, equipped with the Skorohod topology.

This may be true. However, I do not know the proof.

Please tell me a reference.


Best answer

For a sequence $ M^n=\{M_t^n, t\in [0,1] \} $ of square-integrable martingales, the conclusion $ M^n\overset{D([0,1])}{\longrightarrow}\mathrm{BM} $ need not follow from $ [M^n]_t\to t$, $t\in [0,1] $. On p. 476 of the book Jacod, J. and Shiryaev, A. N., Limit Theorems for Stochastic Processes, 2nd ed., Springer, 2003, there is an example showing that the condition $ [M^n]_t\to t$, $t\in [0,1] $, is not sufficient. On p. 473 of the same book, Theorem 3.11 shows that if $ |\Delta M^n|\le K $, then the conclusion does hold. In general, to guarantee $ M^n\overset{D([0,1])}{\longrightarrow}\mathrm{BM} $, a further restriction on $|\Delta M^n|$ (similar to Lindeberg's condition) is necessary.
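A quick numerical sketch of the positive direction (assuming NumPy; the choice of a Rademacher walk is mine, not from the cited theorem): for $M^n_t = n^{-1/2}\sum_{k\le nt}\xi_k$ with $\xi_k=\pm 1$, we have $[M^n]_t = \lfloor nt\rfloor/n \to t$ and jumps $|\Delta M^n| = n^{-1/2}$, so the bounded-jump result applies and $M^n_1$ should look like $N(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Rescaled Rademacher walk: M^n_t = n^{-1/2} * sum_{k <= nt} xi_k, xi_k = +/-1.
# Here [M^n]_t = floor(nt)/n -> t and |Delta M^n| = n^{-1/2}, so the
# bounded-jump theorem applies and M^n should converge weakly to BM.
n, paths = 10_000, 200_000

# Terminal value M^n_1: a sum of n Rademacher steps equals 2*Binomial(n, 1/2) - n.
M1 = (2.0 * rng.binomial(n, 0.5, size=paths) - n) / np.sqrt(n)

print(round(M1.mean(), 3), round(M1.var(), 3))  # close to 0 and 1
frac = np.mean(np.abs(M1) < 1.0)
print(round(frac, 3))                           # close to the N(0,1) value 0.683
```

This only checks the one-dimensional marginal at $t=1$, of course, not convergence in $D([0,1])$.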


Adding a bit to the previous answer: if one formulates the question in terms of the angle bracket (a more natural question in some contexts) instead of the square bracket, the statement is also not true---not even when $M_n = M$ for all $n$ and $\langle M \rangle_t = t$.

A martingale $M$ with cadlag paths and quadratic variation $\langle M \rangle_t = t$ need not be a standard Brownian motion---take $M_t$ to be the compensated Poisson process $N_t - t$, where $N_t$ is a Poisson process with intensity $1$. However, examining this counterexample leads to a sufficient condition for when the statement does hold, in the sense of weak convergence on $D[0,1]$.
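The counterexample is easy to see numerically (a sketch assuming NumPy): $M_1 = N_1 - 1$ has the same mean $0$ and variance $1$ as Brownian motion at $t=1$, but it is visibly skewed, so the marginal law is not Gaussian:

```python
import numpy as np

rng = np.random.default_rng(0)

# M_t = N_t - t with N a rate-1 Poisson process: a martingale with
# <M>_t = t, yet M_1 is not Gaussian.
t, samples = 1.0, 500_000
M = rng.poisson(t, size=samples) - t

print(round(M.mean(), 3), round(M.var(), 3))  # ~0 and ~1, matching BM at t=1
skew = np.mean(M**3) / M.var() ** 1.5
print(round(skew, 2))                         # ~1.0; a Gaussian would give 0
```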

The issue is whether $M$ has the same finite-dimensional distributions as Brownian motion, since Kolmogorov's continuity criterion then gives a version with continuous sample paths. Having the right first moment and quadratic variation/"infinitesimal second moment" is not enough to guarantee the same finite-dimensional distributions as Brownian motion.

Suppose $\langle M \rangle_t = t$ and that $M$ has independent increments---e.g. $M_t = N_t - t$. (The general martingale case is not much different; one replaces unconditional means by conditional means.)

$M$ has the same finite dimensional distribution as Brownian motion if and only if $$ \phi_{\lambda}(t) = E[e^{i \lambda M_t}] = e^{-\frac12 \lambda^2 t}, $$ or
$$ \frac{d}{dt} \phi_{\lambda}(t) = -\frac12 \lambda^2 \phi_{\lambda}(t), \; \phi_{\lambda}(0) = 1. $$ This ODE means that, for a small increment $\Delta_h = M_{t+h} - M_t$, we must have $$ E[e^{i \lambda \Delta_h}] = E[ 1 + i \lambda \Delta_h - \frac12 \lambda^2 \Delta_h^2 + \cdots ] = 1 - \frac12 \lambda^2 h + r(h), \quad \quad (*) $$ where the remainder term $r(h) = o(h)$, i.e. it must vanish faster than $h$. This is not true in general. E.g. for the process $N_t - t$, one can calculate directly $$ E[e^{i \lambda \Delta_h}] = e^{-\frac12 \lambda^2 h + O(h)} = 1 - \frac12 \lambda^2 h + O(h). $$ The remainder term is $O(h)$ and not $o(h)$, so $N_t - t$ does not have the same finite-dimensional distributions as Brownian motion.
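The direct calculation for $N_t - t$ can be checked exactly: the increment $\Delta_h = N_h - h$ has characteristic function $E[e^{i\lambda\Delta_h}] = \exp\bigl(h(e^{i\lambda}-1-i\lambda)\bigr)$, so the remainder $r(h)$ in $(*)$ satisfies $|r(h)|/h \to |e^{i\lambda}-1-i\lambda+\tfrac12\lambda^2| \neq 0$. A sketch (assuming NumPy):

```python
import numpy as np

lam = 1.0
# Exact characteristic function of Delta_h = N_h - h for a rate-1 Poisson N:
# E[exp(i*lam*Delta_h)] = exp(h * (e^{i*lam} - 1 - i*lam)).
c = np.exp(1j * lam) - 1 - 1j * lam

for h in [1e-2, 1e-4, 1e-6]:
    phi = np.exp(h * c)
    r = phi - (1 - 0.5 * lam**2 * h)  # remainder r(h) in (*)
    # |r(h)|/h tends to |c + lam^2/2| ~ 0.164, not to 0: r(h) is O(h), not o(h).
    print(h, abs(r) / h)
```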

A standard sufficient condition for the remainder $r(h)$ in $(*)$ to be $o(h)$ is that, for all $t$, $$ \lim_{\alpha \rightarrow \infty} \limsup_{h \rightarrow 0} E[ \frac{ \Delta_h^2 }{h} \cdot 1_{ \{ \frac{ \Delta_h^2 }{h} > \alpha \} } ] = 0. \quad \quad (**) $$ The uniform integrability condition $(**)$ is a kind of infinitesimal version of the Lindeberg condition for CLTs.
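One can see numerically how $(**)$ fails for the compensated Poisson process (a sketch assuming NumPy; the particular values of $\alpha$ and $h$ are mine): a jump of size $1$ occurs on $[t, t+h]$ with probability $\approx h$ and then $\Delta_h^2/h \approx 1/h$, so the truncated expectation stays near $1$ as $h \to 0$ instead of vanishing:

```python
import numpy as np

rng = np.random.default_rng(0)

# Delta_h = N_{t+h} - N_t - h, a compensated Poisson increment. A unit jump
# has probability ~h and makes Delta_h^2/h ~ 1/h, so for fixed alpha the
# quantity E[(Delta_h^2/h) * 1{Delta_h^2/h > alpha}] stays near 1 as h -> 0:
# the uniform-integrability condition (**) fails.
alpha, samples = 50.0, 2_000_000
for h in [1e-2, 1e-3]:
    Z = (rng.poisson(h, size=samples) - h) ** 2 / h
    print(h, round(np.mean(Z * (Z > alpha)), 2))  # stays near 1, does not vanish
```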

A couple of comments:

  1. Every local martingale $M$ with continuous paths and $\langle M \rangle_t = t$ must be a standard Brownian motion (Lévy's theorem). In the context of this discussion, this says that continuity of the sample paths guarantees that $r(h)$ in $(*)$ is $o(h)$---indeed, this is Ito's lemma for continuous local martingales. This requires the martingale property, namely that $\int d \langle M \rangle_t$ over an interval can be approximated by a sum of $\Delta_h^2$'s as $h \rightarrow 0$ when the paths are continuous.

  2. Invoking Ito's lemma again, $(*)$ can be recast in terms of the infinitesimal generator: a martingale is a Brownian motion if and only if its infinitesimal generator is $\frac12 \frac{d^2}{dx^2}$.

Regarding a sequence $\{ M_n, n \geq 1\}$: first, there is no need or reason to assume each $M_n$ is a martingale. See, for example, the various versions of the functional central limit theorem in which partial sums of dependent sequences (strong mixing, mixingales, etc.) converge weakly to Brownian motion. The martingale condition only needs to hold "in the limit".

Second, tightness must be considered: a separate assumption is needed so that $\{ M_n, n \geq 1\}$ is tight on $D[0,1]$. Then a standard sufficient condition ensuring that the weak limit is Brownian motion is again $(**)$, extended appropriately to sequences of random elements of $D[0,1]$. A detailed discussion can be found in Section 19 of Convergence of Probability Measures by Billingsley.