On the calculation of mean square limit for the Ito integral


Most handouts and books on Itô calculus show a simple way to evaluate the integrals below without making use of Itô's lemma: \begin{equation} \int_{t_0}^t\mathrm{d}W(s),\qquad \int_{t_0}^t W(s)\,\mathrm{d}W(s). \end{equation} C. Gardiner, in his book *Stochastic Methods: A Handbook for the Natural and Social Sciences* (chapter 4), uses the so-called "mean-square limit". Let us take \begin{equation} S:=\int_{t_0}^t W(s)\,\mathrm{d}W(s). \end{equation} Dividing the interval $[t_0,t]$ into $n$ equally spaced subintervals with grid points $t_0=s_0<s_1<\dots<s_n=t$, the partial sums $S_n$ of $S$ are defined by \begin{equation} S_n=\sum_{j=1}^n W(s_{j-1})\left(W(s_j)-W(s_{j-1})\right), \qquad n\in \mathbb N. \end{equation} Then $S$ is the mean-square limit of $S_n$, \begin{equation} S=\underset{n\to\infty}{\operatorname{ms-lim}}\,S_n, \end{equation} in the sense that it satisfies \begin{equation}\tag{1} \lim_{n\to\infty}\langle (S_n - S)^2\rangle =0. \end{equation}

I'm sorry, but $(1)$ doesn't seem very useful; it defines a property of $S$ rather than a procedure to actually find it. Isn't there a working definition of $S$ in terms of the partial sums?



BEST ANSWER

In the book you mentioned, there are some steps before the author jumps to the limit:

  • Rewriting $S_n$: The author writes $$ S_n=\frac{1}{2}\left[W_t^2-W_{t_0}^2\right]-\frac{1}{2}\sum_{i=1}^n(\Delta W_i)^2\hspace{1cm}(1) $$ where $\Delta W_i := W_{t_i}-W_{t_{i-1}}$. This is a pathwise telescoping identity: it follows from $W_{t_{i-1}}\Delta W_i=\frac{1}{2}\left(W_{t_i}^2-W_{t_{i-1}}^2\right)-\frac{1}{2}(\Delta W_i)^2$ summed over $i$. The behaviour of $S_n$ therefore hinges only on the second part, since $\frac{1}{2}\left[W_t^2-W_{t_0}^2\right]$ is independent of $n$.
  • Evaluating the mean of that term: $$\left\langle\sum_{i=1}^n(\Delta W_i)^2\right\rangle=\sum_{i=1}^n(t_i-t_{i-1})=t-t_0.$$ Since we are working with the mean-square limit, the sequence must converge in the mean-square sense: it is not enough to compute this mean, one must also prove that the sum converges to it in mean square.
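Both bullets above can be checked numerically. Below is a minimal sketch (assuming NumPy; the variable names are mine, not from the book) that verifies the rewriting of $S_n$ is an exact pathwise identity and that the mean of $\sum_i(\Delta W_i)^2$ is $t-t_0$:

```python
import numpy as np

rng = np.random.default_rng(0)
t0, t, n, n_paths = 0.0, 1.0, 64, 20_000
dt = (t - t0) / n

# Brownian increments Delta W_i ~ N(0, dt); one row per simulated path
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
# Brownian path on the grid, started at W_{t0} = 0
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

# Left-endpoint partial sum S_n = sum_j W(s_{j-1}) (W(s_j) - W(s_{j-1}))
S_n = np.sum(W[:, :-1] * dW, axis=1)

# Telescoping identity: S_n = (1/2)[W_t^2 - W_{t0}^2] - (1/2) sum_i (Delta W_i)^2
rhs = 0.5 * (W[:, -1] ** 2 - W[:, 0] ** 2) - 0.5 * np.sum(dW ** 2, axis=1)
print(np.max(np.abs(S_n - rhs)))          # tiny: the identity holds pathwise

# Mean of sum_i (Delta W_i)^2 equals t - t0
print(np.mean(np.sum(dW ** 2, axis=1)))   # ≈ 1.0
```

Note the identity holds up to floating-point error on every single path (it is pure algebra), whereas the mean is only a Monte Carlo estimate of $t-t_0$.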

In the book there are two typos. The first is that instead of subtracting $(t-t_0)^2$ you should subtract $(t-t_0)$ in $(4.2.16)$.

We must therefore show that
$$\left\langle\left[\sum_{i=1}^n(\Delta W_i)^2-(t-t_0)\right]^2\right\rangle\longrightarrow 0 \quad\text{as } n\to\infty.$$

  • Using some elementary properties of the Brownian motion $(W_t)_{t\geq 0}$ and of Gaussian random variables ($W_t\sim\mathcal N(0,t)$ for all $t\geq 0$), such as:
  1. $\mathbb E[(W_t-W_s)^2]=t-s$ for $s\leq t$
  2. $\mathbb E[X^4]=3t^2$ for $X\sim\mathcal N(0,t)$

The second typo is at the beginning of $(4.2.19)$: the value $(W_{i}-W_{i-1})$ must be squared. Expanding the square and using the independence of the increments (the sum $\sum_{i\neq j}$ runs over ordered pairs), we get: $$ \begin{align*} \left\langle\left[\sum_{i=1}^n(\Delta W_i)^2-(t-t_0)\right]^2\right\rangle&=3\sum_{i=1}^n(t_i-t_{i-1})^2+\sum_{i\neq j}(t_i-t_{i-1})(t_j-t_{j-1})\\ &\quad-2(t-t_0)\sum_{i=1}^n(t_i-t_{i-1})+(t-t_0)^2\\ &=2\sum_{i=1}^n(t_i-t_{i-1})^2+\left(\sum_{i=1}^n(t_i-t_{i-1})\right)^2-(t-t_0)^2\\ &=2\sum_{i=1}^n(t_i-t_{i-1})^2=\frac{2(t-t_0)^2}{n}\longrightarrow 0 \text{ as } n\rightarrow +\infty \end{align*}$$ where the last equality uses the equal spacing $t_i-t_{i-1}=(t-t_0)/n$.
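This variance formula, $\left\langle\left[\sum_i(\Delta W_i)^2-(t-t_0)\right]^2\right\rangle=2\sum_i(t_i-t_{i-1})^2=2(t-t_0)^2/n$ for equal spacing, lends itself to a quick Monte Carlo sanity check (NumPy assumed; names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
t0, t, n, n_paths = 0.0, 1.0, 16, 200_000
dt = (t - t0) / n

# Independent Brownian increments Delta W_i ~ N(0, dt)
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
quad_var = np.sum(dW ** 2, axis=1)   # sum_i (Delta W_i)^2 per path

# <[sum_i (Delta W_i)^2 - (t - t0)]^2> should equal 2 (t - t0)^2 / n
mse = np.mean((quad_var - (t - t0)) ** 2)
theory = 2 * (t - t0) ** 2 / n
print(mse, theory)   # both ≈ 0.125
```

With $n=16$ the predicted value is $2/16=0.125$; increasing $n$ drives it to $0$, which is the mean-square convergence being proved.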

  • Finally, mean-square convergence is precisely the mode of convergence in which the Itô integral is defined, so: $$ \begin{align*} \int_{t_0}^t W_s\, dW_s&=\underset{n\to\infty}{\operatorname{ms-lim}}\,S_n\\ &=\underset{n\to\infty}{\operatorname{ms-lim}}\left(\frac{1}{2}\left[W_t^2-W_{t_0}^2\right]-\frac{1}{2}\sum_{i=1}^n(\Delta W_i)^2\right)\\ &=\frac{1}{2}\left[W_t^2-W_{t_0}^2\right]-\frac{1}{2}\,\underset{n\to\infty}{\operatorname{ms-lim}}\sum_{i=1}^n(\Delta W_i)^2\\ &=\frac{1}{2}\left[W_t^2-W_{t_0}^2\right]-\frac{1}{2}(t-t_0)\\ &=\frac{1}{2}\left[W_t^2-W_{t_0}^2-(t-t_0)\right] \end{align*} $$ As the author himself notes (''An exact calculation is possible''), the Itô approach is far easier than this direct computation.
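Since $S_n-S=\frac{1}{2}\left[(t-t_0)-\sum_i(\Delta W_i)^2\right]$ by the rewriting of $S_n$, the rate of mean-square convergence is explicit: $\langle(S_n-S)^2\rangle=(t-t_0)^2/(2n)$. A sketch that estimates this error for increasing $n$ (NumPy assumed; $t_0=0$ is chosen so that $W_{t_0}=0$ and the closed form simplifies):

```python
import numpy as np

rng = np.random.default_rng(2)
t0, t, n_paths = 0.0, 1.0, 100_000

def ms_error(n):
    """Monte Carlo estimate of <(S_n - S)^2>, with S = (1/2)[W_t^2 - (t - t0)]."""
    dt = (t - t0) / n
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
    W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)
    S_n = np.sum(W[:, :-1] * dW, axis=1)   # left-endpoint partial sums
    S = 0.5 * (W[:, -1] ** 2 - (t - t0))   # closed form, using W_{t0} = 0
    return np.mean((S_n - S) ** 2)

# Theory predicts (t - t0)^2 / (2 n): doubling n halves the mean-square error
for n in (8, 16, 32):
    print(n, ms_error(n), (t - t0) ** 2 / (2 * n))
```

The estimated errors track $(t-t_0)^2/(2n)$ and shrink as $n$ grows, which is exactly the statement $\lim_{n\to\infty}\langle(S_n-S)^2\rangle=0$ from the question.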

  • EDIT: A random sequence $(X_n)_{n\geq 0}$ is said to converge to a random variable $X$ in mean square (or in $L^2$) iff $$\lim_{n\rightarrow \infty}\mathbb E\left[|X_n-X|^2\right]=0.$$ When rewriting the integral $S$ via the classical partial sums, we are left with the term $\frac{1}{2}\sum_{i=1}^n(\Delta W_i)^2$, which can only be handled by establishing its convergence in $L^2$ to $(t-t_0)$. This allows us to write safely $$\underset{n\to\infty}{\operatorname{ms-lim}}\sum_{i=1}^n(\Delta W_i)^2=t-t_0.$$ Notice that the limit here is understood in the mean-square sense, which in turn implies convergence in probability and in distribution.

The mean-square limit is a tool used mostly to check whether a random sequence converges to some random variable in $L^2$. In most cases it is not easy to determine the variable $X$ to which the sequence converges; some exercises hand you the candidate random variable, and you only need to verify the convergence.

In this example of evaluating the integral $S$, the mean-square limit was the tool that established the convergence of the second term of $S_n$.