$L^2$ convergence of the quadratic variation of a Brownian motion


The Problem. Given a Brownian motion $(B_s,\, s \in [0,t])$ and a partition $0=t_0 < t_1 < \dots < t_n \le t$ of $[0,t]$ such that $\max_{j}|t_{j+1} - t_j| \to 0$ as $n \to \infty$, show that the random variable $$\sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2$$ converges to $t$ in $L^2$ as $n \to \infty$.

I am taking this class in stochastic calculus, and I am quite rusty on probability theory. However, I think I understand the key concepts of Brownian motion, so I just need help putting the pieces together. I know I want to show that $$\lim_{n \to \infty} \left\Vert \sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2 - t \right\Vert_2 = 0,$$ and since $\Vert X \Vert_2^2 = \mathbb{E}\left[|X|^2\right]$, it is equivalent to show $$\lim_{n \to \infty} \mathbb{E}\left[ \left| \sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2 - t \right|^2 \right] = 0.$$

I also realize that $$\mathbb{E} \left[ \sum_{j=1}^n (B_{t_j} - B_{t_{j-1}})^2 \right ] = \sum_{j=1}^n \mathbb{E} \left[ (B_{t_j} - B_{t_{j-1}})^2 \right ] = \sum_{j=1}^n (t_j - t_{j-1}) = t,$$ since each increment $B_{t_j} - B_{t_{j-1}}$ has variance $t_j - t_{j-1}$ and the sum telescopes (to $t_n$, which should equal $t$). I guess my biggest problem is that I don't know how to break into this $L^2$ norm mathematically. Any help is greatly appreciated.
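
As a sanity check on the statement, here is a quick Monte Carlo sketch (a minimal illustration under my own assumptions: a uniform partition $t_j = jt/n$ and a fixed number of simulated paths; the variable names are mine):

```python
import numpy as np

# Monte Carlo check: the quadratic-variation sum should converge to t in L^2.
rng = np.random.default_rng(0)
t = 1.0
n_paths = 20_000  # number of simulated Brownian paths (arbitrary choice)

for n in (10, 100, 1000):
    dt = t / n  # uniform partition 0 = t_0 < t_1 < ... < t_n = t
    # Increments B_{t_j} - B_{t_{j-1}} are independent N(0, dt) variables.
    increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
    qv = (increments ** 2).sum(axis=1)  # quadratic-variation sum per path
    l2_sq = np.mean((qv - t) ** 2)      # estimate of E[(sum - t)^2]
    print(f"n = {n:4d}   E[(sum - t)^2] ~ {l2_sq:.5f}   (2t^2/n = {2 * t**2 / n:.5f})")
```

The estimated $L^2$ error shrinks like $2t^2/n$ on a uniform partition, which suggests that the mesh going to zero is exactly what drives the convergence.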


Best Answer

Hint

In your partition you took $t_n \leq t$, but you must take $t_n = t$: otherwise the sum converges to $t_n$ rather than $t$. Then,

\begin{align*} &\mathbb E\left[\left(\sum_{j=0}^{n-1}(B_{t_{j+1}}-B_{t_j})^2-t\right)^2\right]\\ &=\mathbb E\left[\left(\sum_{j=0}^{n-1}\Big((B_{t_{j+1}}-B_{t_j})^2-(t_{j+1}-t_j)\Big)\right)^2\right]\\ &=\sum_{i=0}^{n-1}\sum_{j=0}^{n-1}\mathbb E\left[\Big((B_{t_{j+1}}-B_{t_j})^2-(t_{j+1}-t_j)\Big)\Big((B_{t_{i+1}}-B_{t_i})^2-(t_{i+1}-t_i)\Big)\right]. \end{align*}

I'll let you continue.
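
For completeness, here is a sketch of one way the computation can continue (filling in steps the hint leaves open, using independence of disjoint increments and the Gaussian fourth-moment identity $\mathbb{E}[Z^4] = 3\sigma^4$ for $Z \sim \mathcal{N}(0,\sigma^2)$):

The cross terms $i \neq j$ vanish, since the two factors are independent and each is centered. On the diagonal $i = j$, writing $\sigma^2 = t_{j+1} - t_j$, $$\mathbb{E}\left[\Big((B_{t_{j+1}}-B_{t_j})^2 - \sigma^2\Big)^2\right] = 3\sigma^4 - 2\sigma^4 + \sigma^4 = 2\sigma^4,$$ so the double sum reduces to $$\sum_{j=0}^{n-1} 2(t_{j+1}-t_j)^2 \le 2\max_j (t_{j+1}-t_j) \sum_{j=0}^{n-1}(t_{j+1}-t_j) = 2t \max_j (t_{j+1}-t_j) \longrightarrow 0$$ as $n \to \infty$, by the mesh condition.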