Asymptotics of Brownian motion


Let $X$ be a Brownian motion. Then a corollary of the law of the iterated logarithm says that: $$\limsup_{s\rightarrow +\infty} \frac{X_s}{(2s\log\log s)^{1/2}} = 1 \quad a.s. \tag{3.6}$$ $$\liminf_{s\rightarrow +\infty} \frac{X_s}{(2s\log\log s)^{1/2}} = -1 \quad a.s. \tag{3.7}$$

With help from an earlier question I posted, my interpretation of (3.6) and (3.7) is that as $t \rightarrow \infty$ a Brownian motion with the scaling $(2t\log\log t)^{-1/2}$ will almost surely take values within $(-1,1)$. This is only a statement about the $t \rightarrow \infty$ limit so for any finite time the Brownian motion may take values outside $(-1,1)$, but we expect that this will happen less often as $t$ increases.

What is confusing me is the following remark in the book *Stochastic Calculus* by Baldi:

(3.6) and (3.7) give important information concerning the asymptotics of the Brownian motion as $t \rightarrow + \infty$. Indeed they imply the existence of two sequences of times $(t_n)_n, (s_n)_n$, with $\lim_{n\rightarrow \infty} t_n = \lim_{n\rightarrow \infty} s_n = +\infty$, such that $$X_{t_n} \geq (1-\epsilon)\sqrt{2t_n \log\log t_n}$$ $$X_{s_n} \leq -(1-\epsilon)\sqrt{2s_n \log\log s_n}.$$ This means that, as $t \rightarrow +\infty$, the Brownian motion takes arbitrarily large positive and negative values infinitely many times. It therefore exhibits larger and larger oscillations. Since the paths are continuous, it in particular visits every real number infinitely many times.

(3.6) and (3.7) also give a bound on how fast a Brownian motion moves away from the origin. In particular, (3.6) implies that, for $t$ large, $X_t(\omega) \leq (1 + \epsilon) \sqrt{2t \log\log t}$. Similarly, by (3.7), for $t$ large, $X_t(\omega) \geq -(1+\epsilon)\sqrt{2t\log\log t}$, so that for large $t$, $$|X_t(\omega)| \leq (1+\epsilon)\sqrt{2t\log\log t}. \tag{3.8}$$ To be precise, there exists a $t_0 = t_0(\omega)$ such that (3.8) holds for every $t \geq t_0$.

How did he obtain the bounds for $X_{t_n}$ and $X_{s_n}$ in the passage (specifically the $(1-\epsilon)$ factor), and how do they imply that the Brownian motion takes arbitrarily large positive and negative values infinitely often? My guess is that the function $(2s\log\log s)^{-1/2}$ decays to $0$, so in order for the quotient in (3.6) to stay away from $0$ it must be that $X_s$ grows without bound, hence it takes arbitrarily large positive and negative values infinitely often. If this is the right idea, how does his argument make it rigorous?

Similarly for the second part, how does he obtain the $(1+\epsilon)$ factor and how does this imply the existence of the $t_0$ he describes?

Best answer:

These facts follow from properties of $\limsup$ and $\liminf$:

We can rewrite (3.6) as (all the following identities and inequalities hold $\mathbb{P}$-almost surely) $$ \lim_{s \to \infty} \sup_{t \geq s} \frac{X_t}{(2t\log\log t)^{1/2}} = 1. $$ By definition of the limit, for every $\epsilon > 0$ (here we also assume $\epsilon < 1$) we can find $s_0 > 0$ such that for every $s_1 \geq s_0$ $$ \sup_{t \geq s_1} \frac{X_t}{(2t\log\log t)^{1/2}} \geq 1-\epsilon/2. \tag{1}$$ By a property of the supremum (some point comes within $\epsilon/2$ of it), there exists $t_1 \geq s_1$ such that $$ \frac{X_{t_1}}{(2t_1\log\log t_1)^{1/2}} \geq 1-\epsilon \implies X_{t_1} \geq (1-\epsilon) (2t_1\log\log t_1)^{1/2}. $$ Since (1) remains true if we replace $s_1$ with any $s \geq s_1$, it holds in particular for $s_2 := t_1 + 1 \geq s_1$. Starting from this $s_2$ we can find $t_2 \geq s_2 > t_1$ as above, and so on. Continuing this procedure, we construct a strictly increasing sequence $(t_n)_{n \geq 1}$ with $t_n \to \infty$ satisfying the required inequality for every $n \geq 1$. The sequence $(s_n)_{n \geq 1}$ is constructed in the same way, reversing the inequalities and using the corresponding property of the infimum.
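The step from the supremum bound (1) to the existence of $t_1$ is the standard $\epsilon/2$ argument; writing $f(t) = X_t/(2t\log\log t)^{1/2}$ (a shorthand not used in the original), it can be displayed explicitly:

$$
\sup_{t \geq s_1} f(t) \;\geq\; 1 - \frac{\epsilon}{2}
\quad\Longrightarrow\quad
\exists\, t_1 \geq s_1 : \; f(t_1) \;>\; \Big(\sup_{t \geq s_1} f(t)\Big) - \frac{\epsilon}{2} \;\geq\; 1 - \epsilon,
$$

where the first implication holds because the supremum is the least upper bound, so some point of the set comes within $\epsilon/2$ of it (and if the supremum were infinite, finding such a $t_1$ would be even easier).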

To see how this implies that the Brownian motion takes arbitrarily large positive and negative values infinitely many times, let $M > 0$ be a positive constant. We want to show that the values taken by the Brownian motion exceed $M$ infinitely often. Since $t \mapsto (1-\epsilon)\sqrt{2t\log\log t}$ diverges to $+\infty$ as $t \to +\infty$, there exists $t^*$ such that this function exceeds $M$ for every $t \geq t^*$. Hence, we can find $n \geq 1$ such that $t_n \geq t^*$, so that $$X_{t_n} \geq (1-\epsilon)\sqrt{2t_n\log\log t_n} \geq M, \tag{2}$$ since $t \mapsto (1-\epsilon)\sqrt{2t\log\log t}$ is increasing for $t > e$. By monotonicity, (2) also holds for $t_m$ for every $m \geq n$. We thus conclude that the Brownian motion exceeds $M$ infinitely often.
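A small numeric check of this step (illustration only, not part of the proof): the lower envelope $g(t) = (1-\epsilon)\sqrt{2t\log\log t}$ diverges, so along any sequence of times tending to infinity it eventually exceeds a given level $M$. The sequence $t_n = e^{e^n}$ used below is an ad-hoc choice for which $\log\log t_n = n$ exactly; $\epsilon$ and $M$ are likewise arbitrary.

```python
import math

eps, M = 0.1, 100.0

def g(t: float) -> float:
    """Lower envelope (1 - eps) * sqrt(2 t log log t), defined for t > e."""
    return (1.0 - eps) * math.sqrt(2.0 * t * math.log(math.log(t)))

# Find the first n with g(t_n) >= M, where t_n = exp(exp(n)); from that
# index on, X_{t_n} >= g(t_n) >= M as well, by monotonicity of g.
n = 1
while g(math.exp(math.exp(n))) < M:
    n += 1

print(f"first n with g(t_n) >= {M}: n = {n}")  # here n = 3
```

Any larger $M$ simply pushes the first such index further out; the point is only that it always exists.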

For the last part, we again use the definition of the limit to find $s_0^+ > 0$ such that $$ \sup_{t \geq s_0^+} \frac{X_t}{(2t\log\log t)^{1/2}} \leq 1+\epsilon. \tag{3} $$ In particular, (3) implies that for every $t \geq s_0^+$ $$ \frac{X_t}{(2t\log\log t)^{1/2}} \leq 1+\epsilon \implies X_t \leq (1+\epsilon)(2t\log\log t)^{1/2}. \tag{4}$$ Similarly, by (3.7), we can find $s_0^-$ such that for every $t \geq s_0^-$ $$ X_t \geq -(1+\epsilon)(2t\log\log t)^{1/2}, \tag{5}$$ so that by letting $t_0 = \max(s_0^+, s_0^-)$ both (4) and (5) hold for every $t \geq t_0$, and your (3.8) is thus verified.
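As a sanity check on (3.8) (purely illustrative, not part of the argument), one can simulate a discretized Brownian path and watch the ratio $|X_t|/\sqrt{2t\log\log t}$. Over a finite horizon it typically stays of order one, consistent with the bound $1+\epsilon$ holding for all $t \geq t_0(\omega)$; the grid size, horizon, and seed below are ad-hoc choices for this sketch.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

t_max, n_steps = 1_000.0, 100_000
t = np.linspace(t_max / n_steps, t_max, n_steps)
dt = np.diff(t, prepend=0.0)

# A Brownian path is the cumulative sum of independent N(0, step) increments.
X = np.cumsum(rng.normal(loc=0.0, scale=np.sqrt(dt)))

# The envelope sqrt(2 t log log t) needs log log t > 0, so restrict to t >= 3 (> e).
mask = t >= 3.0
envelope = np.sqrt(2.0 * t[mask] * np.log(np.log(t[mask])))
ratio = np.abs(X[mask]) / envelope

print(f"max of |X_t| / sqrt(2 t log log t) over [3, {t_max:.0f}]: {ratio.max():.3f}")
```

Note that a single finite-horizon path can neither prove nor disprove an almost-sure asymptotic statement; the simulation only makes the scaling in (3.8) concrete.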