Law of Large Numbers for a Brownian Motion


I am self-learning introductory stochastic calculus from A First Course in Stochastic Calculus by L.-P. Arguin.

Part (c) of the exercise below, on the time-inversion property of Brownian motion, asks us to derive the law of large numbers from the previously developed results. I struggled to write a proof of this.

I would like to ask whether my upper bounds and convergence argument for part (c) make sense and are technically correct and rigorous.

I reproduce my solution to parts (a) and (b) for completeness.

Time Inversion. Let $(B_{t},t\geq0)$ be a standard Brownian motion. We consider the process:

\begin{align*} X_{t} & =tB_{1/t}\quad\text{for }t>0 \end{align*}

This property relates the behavior of $t$ large to the behavior of $t$ small.

(a) Show that $(X_{t},t>0)$ has the distribution of Brownian motion on $t>0$.

Proof.

As with $B(t)$, it is easy to show that $X(t)$ is a Gaussian process: any finite linear combination $\sum_{i}a_{i}X(t_{i})=\sum_{i}a_{i}t_{i}B(1/t_{i})$ is a linear combination of jointly Gaussian random variables, hence Gaussian.

Also, $\mathbb{E}[X_{s}]=0$.

Let $s<t$. We have:

\begin{align*} Cov(X_{s},X_{t}) & =\mathbb{E}[sB(1/s)\cdot tB(1/t)]\\ & =st\mathbb{E}[B(1/s)\cdot B(1/t)]\\ & =st\cdot\frac{1}{t}\\ & \quad\left\{ \because\frac{1}{t}<\frac{1}{s}\right\} \\ & =s \end{align*}

Consequently, since a Gaussian process is determined by its mean and covariance functions, $X(t)$ has the distribution of a Brownian motion.
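As a quick numerical sanity check of the covariance computation (not part of the proof; the times $s=0.5$, $t=2$, the sample size, and the seed are my own illustrative choices), one can simulate $X_{t}=tB_{1/t}$ and verify that $\mathrm{Cov}(X_{s},X_{t})\approx\min(s,t)$:

```python
import numpy as np

# Monte Carlo check that Cov(X_s, X_t) = min(s, t) = s for X_t = t * B_{1/t}.
rng = np.random.default_rng(0)
n = 200_000
s, t = 0.5, 2.0          # s < t, so 1/t = 0.5 < 1/s = 2.0

# Sample the Brownian motion at the two inverted times 1/t < 1/s:
# B(1/t) ~ N(0, 1/t), then B(1/s) = B(1/t) + an independent N(0, 1/s - 1/t).
b_inv_t = rng.normal(0.0, np.sqrt(1 / t), n)
b_inv_s = b_inv_t + rng.normal(0.0, np.sqrt(1 / s - 1 / t), n)

x_s = s * b_inv_s
x_t = t * b_inv_t

cov = np.mean(x_s * x_t)  # both have mean 0, so Cov(X_s, X_t) = E[X_s X_t]
print(cov)                # should be close to min(s, t) = 0.5
```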

(b) Argue that $X(t)$ converges to $0$ as $t\to0$ in the sense of $L^{2}$-convergence. It is possible to show convergence almost surely so that $(X_{t},t\geq0)$ is really a Brownian motion for $t\geq0$.

Solution.

Let $(t_{n})$ be an arbitrary sequence of positive real numbers approaching $0$, and consider the sequence of random variables $(X(t_{n}))_{n=1}^{\infty}$. We have:

\begin{align*} \mathbb{E}\left[X(t_{n})^{2}\right] & =\mathbb{E}\left[t_{n}^{2}B(1/t_{n})^{2}\right]\\ & =t_{n}^{2}\mathbb{E}\left[B(1/t_{n})^{2}\right]\\ & =t_{n}^{2}\cdot\frac{1}{t_{n}}\\ & =t_{n} \end{align*}

Hence,

\begin{align*} \lim_{n\to\infty}\mathbb{E}\left[X(t_{n})^{2}\right] & =\lim_{n\to\infty}t_{n}=0 \end{align*}

Since $(t_{n})$ was an arbitrary sequence, it follows that $\lim_{t\to0}\mathbb{E}[(X(t))^{2}]=0$.
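The computation $\mathbb{E}[X(t)^{2}]=t$ can also be checked by simulation (illustrative only; the times, sample size, and seed below are my own choices):

```python
import numpy as np

# Check that E[X(t)^2] = t for X_t = t * B_{1/t}: the second moment
# should shrink to 0 linearly as t -> 0.
rng = np.random.default_rng(1)
n = 500_000
second_moment = {}
for t in (0.2, 0.05, 0.01):
    b = rng.normal(0.0, np.sqrt(1.0 / t), n)  # B(1/t) ~ N(0, 1/t)
    second_moment[t] = np.mean((t * b) ** 2)  # empirical E[X(t)^2]
    print(t, second_moment[t])                # should be close to t
```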

(c) Use this property of Brownian motion to show the law of large numbers for Brownian motion: \begin{align*} \lim_{t\to\infty}\frac{X(t)}{t} & =0\quad\text{almost surely} \end{align*}

Proof Sketch.

Let $(t_n)$ be an arbitrary sequence such that $t_n \to \infty$. Then for every $n \in \mathbf{N}$ there exists an index $k_n$ such that $t_{k_n} > n$.

Consider the sequence of random variables $X_n := X(t_n)$. Let $\epsilon > 0$ be arbitrary. We have:

\begin{align*} \mathbf{P}\left(\left|\frac{X(t_{k_n})}{t_{k_n}}\right|>\epsilon\right) &= \mathbf{P}\left[\left(\frac{X(t_{k_n})}{t_{k_n}}\right)^4>\epsilon^4\right]\\ &= \mathbf{P}[X(t_{k_n})^4 > t_{k_n}^4 \epsilon^4]\\ &\leq \frac{1}{t_{k_n}^4 \epsilon^4} \mathbf{E}[X(t_{k_n})^4]\\ & \quad \left\{ \text{ Markov's inequality }\right\} \\ &= \frac{1}{t_{k_n}^4 \epsilon^4} \cdot 3t_{k_n}^2 \\ & \quad \left\{ \text{ fourth moment of a standard Brownian motion }\right\} \\ &= \frac{3}{\epsilon^4} \cdot \frac{1}{t_{k_n}^2} \\ &\leq \frac{3}{\epsilon^4} \cdot \frac{1}{n^2} \end{align*}

Since $\sum \frac{1}{n^2}$ is a convergent series, by the comparison test $\sum_{n=1}^{\infty} \mathbf{P}\left(\left|\frac{X(t_{k_n})}{t_{k_n}}\right|>\epsilon\right)$ converges.
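The fourth-moment bound can be sanity-checked numerically. Since $X$ is a Brownian motion, $X(t)/t \sim N(0, 1/t)$, so the exact tail probability is $\mathbf{P}(|X(t)/t|>\epsilon)=\operatorname{erfc}(\epsilon\sqrt{t/2})$, which the bound $3/(\epsilon^4 t^2)$ must dominate. The values of $\epsilon$ and $t$ below are illustrative choices of mine:

```python
import math

# Compare the exact Gaussian tail of X(t)/t ~ N(0, 1/t) with the
# fourth-moment Markov bound 3 / (eps^4 * t^2) used above.
eps = 0.5
tails = {}
for t in (10.0, 100.0, 1000.0):
    exact = math.erfc(eps * math.sqrt(t / 2.0))  # P(|N(0, 1/t)| > eps)
    bound = 3.0 / (eps**4 * t**2)
    tails[t] = (exact, bound)
    print(t, exact, bound)                       # exact <= bound in each row
```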

We know that if, for all $\epsilon>0$, $\sum_{n=1}^{\infty} \mathbf{P}(|X_n - X| > \epsilon) < \infty$, then $X_n \to X$ almost surely (a standard consequence of the Borel–Cantelli lemma).

Consequently, $\lim_{n \to \infty} \frac{X(t_n)}{t_n} = 0$ almost surely. Since $(t_n)$ was an arbitrary sequence, $\lim_{t \to \infty} \frac{X(t)}{t} = 0$ almost surely.
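Since $X$ has the law of a standard Brownian motion, the statement can be illustrated by simulating $B$ directly: the empirical mean of $|B_t|/t$ (whose exact value is $\sqrt{2/(\pi t)}$) shrinks as $t$ grows. The times, sample size, and seed below are arbitrary choices of mine:

```python
import numpy as np

# Illustration of the law of large numbers for Brownian motion:
# sample B(t) ~ N(0, t) at increasing times and watch E[|B(t)|/t] shrink.
rng = np.random.default_rng(2)
n = 100_000
mean_abs = {}
for t in (10.0, 1000.0, 100000.0):
    b_t = rng.normal(0.0, np.sqrt(t), n)     # B(t) ~ N(0, t)
    mean_abs[t] = np.mean(np.abs(b_t) / t)   # empirical E[|B(t)| / t]
    print(t, mean_abs[t])                    # decreases toward 0
```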

Accepted Answer

What you need to do is show that $X_{t}\to 0$ as $t\to 0$ almost surely. That would show that $\frac{B_{1/t}}{1/t}\to 0$ as $t\to 0$ almost surely, which is the same as showing $\frac{B_{t}}{t}\to 0$ as $t\to\infty$, which is the law of large numbers for Brownian motion.

What you have done is show that $E(X(t)^{2})\to 0$ as $t\to 0$, which gives convergence in the $L^2$ sense and hence convergence in probability. That is in fact the weak law of large numbers: $\frac{B_{t}}{t}\xrightarrow{P} 0$ as $t\to\infty$.

To show almost sure convergence, you have to argue that $X(t)\to 0$ as $t\to 0$, which cannot be done from $L^2$ convergence alone via subsequence arguments. That is a separate proof in itself, which is done to show that $X_{t}$ is indeed a Brownian motion. To do that, you do have to show continuity of sample paths almost surely.

For $t>0$, continuity is clear. However, the proof that $X(t)\to 0$ almost surely as $t\to 0$ is the main step, and it is the one you have not done. You have only shown convergence in $L^2$, and hence in probability, which is NOT equivalent to almost sure convergence.

I am posting the proof I like most, from René Schilling's book Brownian Motion.

Note that $X(t)\to 0$ as $t\to 0$ if and only if for all $n\geq 1$, there exists $m\geq 1$ such that for all $r\in \mathbb{Q}\cap (0,\frac{1}{m}]$, you have $|X(r)|=|rB(1/r)|\leq \frac{1}{n}$. (Restricting to rational $r$ suffices because the paths of $X$ are continuous on $t>0$.)

(To understand the above, recall the $\epsilon-\delta$ definition of continuity. Note that $\frac{1}{n}$ works as $\epsilon$ and $\frac{1}{m}$ works as $\delta$).

That is, $$\{X(t)\to 0\} = \cap_{n\geq 1}\cup_{m\geq 1}\cap_{r\in\mathbb{Q}\cap (0,\frac{1}{m}]}\{|X(r)|\leq\frac{1}{n}\}$$

But the RHS has the same probability as $\cap_{n\geq 1}\cup_{m\geq 1}\cap_{r\in\mathbb{Q}\cap (0,\frac{1}{m}]}\{|W(r)|\leq\frac{1}{n}\}$, where $W$ denotes a standard Brownian motion. This is because $X_{t}$ and $W_{t}$ have the same law, i.e. the same finite-dimensional distributions, and the unions/intersections on the RHS are all countable.

Thus $P(\{X(t)\to 0\text{ as }t\to 0\})=P(\{W(t)\to 0\text{ as }t\to 0\}) = 1$, and that's it.

This actually shows that $X(t)$ is a bona fide standard Brownian motion, as we have established continuity as well.

There are many other ways of proving this. For example, a standard proof goes through Doob's maximal inequality.

Actually, time inversion and the law of large numbers are two sides of the same coin: each implies the other.