With probability one, the function $t \mapsto B_t$ is nowhere differentiable.


The context is that we have a theorem about Brownian motion which says that with probability one, the function $t \mapsto B_t$ is nowhere differentiable.

I am confused about what is meant by the function $t \mapsto B_t$. It seems to refer to the Brownian motion $B_t$, but it also seems like a function sending $t$ into a function space, i.e. $$f: \mathbb{R} \to \text{some function space}.$$ Am I right? How are open sets defined in this function space? By an $L^p$ norm? Could you help to explain this with a concrete example?


There are 2 solutions below.

BEST ANSWER

Brownian motion $(B_t)_{t\geq 0}$ can be thought of as a collection of normal random variables $B_t\sim \mathcal{N}(0,t)$ defined on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ and satisfying certain conditions, such as independent and stationary increments and almost surely continuous trajectories.

What do that last condition, and the event you asked about, mean?

Let $$A=\{\omega : t\mapsto B_t(\omega) \text{ is nowhere differentiable.} \}$$ be the event in question.

For every fixed $\omega$, $t\mapsto B_t(\omega)$ defines a trajectory: an ordinary function $B_{\bullet}(\omega):[0,\infty)\to \mathbb{R}$. For this fixed function, one can test whether the claim holds or not.

Then the set of $\omega$ with nowhere differentiable trajectories is an event whose probability we can compute, and it turns out to be $1$.
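As a numerical illustration of why differentiability fails, one can look at difference quotients: the increment $B_{t+h}-B_t$ has distribution $\mathcal{N}(0,h)$, so $\mathbb{E}\,|B_{t+h}-B_t|/h = \sqrt{2/(\pi h)}$, which blows up as $h\to 0$. A minimal Monte Carlo sketch (the function name and sample size are my own choices, not from the theorem):

```python
import math
import random

random.seed(0)

def avg_abs_difference_quotient(h, n_samples=20000):
    """Estimate E|B_{t+h} - B_t| / h by sampling increments.
    Since B_{t+h} - B_t ~ N(0, h), the true value is sqrt(2/(pi*h)),
    which diverges as h -> 0."""
    total = 0.0
    for _ in range(n_samples):
        increment = random.gauss(0.0, math.sqrt(h))  # one N(0, h) draw
        total += abs(increment) / h
    return total / n_samples

for h in (1.0, 0.01, 0.0001):
    print(h, avg_abs_difference_quotient(h))
```

The printed averages grow roughly like $h^{-1/2}$, matching the heuristic that a Brownian increment over time $h$ has size of order $\sqrt{h}$, far too large for a derivative to exist.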

As further intuition, when people graph Brownian motion they are graphing, for a fixed $\omega$, one of these trajectories, which traces out a curve on the line (or in space, if it is a $d$-dimensional Brownian motion). The following is an example graph of a Brownian trajectory $t\mapsto B_t(\omega)$, where the $x$-axis is time $t$ and the $y$-axis is the position on the line:

[Figure: a sample Brownian motion trajectory]
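A standard way to produce such a graph is to approximate the trajectory by a random walk with independent $\mathcal{N}(0,\Delta t)$ increments. A minimal sketch (function name and step count are my own choices):

```python
import math
import random

random.seed(1)

def brownian_path(T=1.0, n_steps=1000):
    """Sample one trajectory t -> B_t(omega) on [0, T]: a single draw
    of omega gives a single deterministic path, built here by summing
    independent N(0, dt) increments starting from B_0 = 0."""
    dt = T / n_steps
    path = [0.0]
    for _ in range(n_steps):
        path.append(path[-1] + random.gauss(0.0, math.sqrt(dt)))
    return path

# Each call corresponds to a different omega, hence a different curve.
path1 = brownian_path()
path2 = brownian_path()
```

Plotting `path1` against the time grid gives a picture like the one above; calling the function again (a new $\omega$) gives a visibly different curve.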

The same treatment that event $A$ received applies to other events, which provide further concrete examples. A first one is $$C=\{\omega : t\mapsto B_t(\omega) \text{ is continuous}\}.$$

For a last concrete example we may look at processes indexed by discrete time instead. Let $(X_n)_n$ be a Markov process with state space $\mathbb{N}_0$ recording the height of a pile of coins, with transition probabilities $P_{i,i+1}=\frac{1}{i+1}$ and $P_{i,0}=1-\frac{1}{i+1}$ for $i\in \mathbb{N}_0$, and initial state $0$. Then the event $$\{\omega : n\mapsto X_n(\omega) \text{ is increasing}\}$$ can be thought of in the same manner: for each fixed $\omega$, the sequence $n \mapsto X_n(\omega)$ either is increasing or is not.
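This discrete example is easy to probe numerically. The chance that the first $n$ steps are all upward moves is $\prod_{i=0}^{n-1}\frac{1}{i+1}=\frac{1}{n!}$ (from state $0$ the chain always moves up, from state $1$ with probability $\frac12$, and so on), so for $n=4$ the probability is $\frac{1}{24}$. A Monte Carlo sketch of this (function name and trial count are my own choices):

```python
import random

random.seed(2)

def pile_increasing_for(n_steps):
    """Run the coin-pile chain from X_0 = 0 and report whether the
    first n_steps transitions were all upward moves.  From state i the
    chain moves to i+1 with probability 1/(i+1), else resets to 0."""
    state = 0
    for _ in range(n_steps):
        if random.random() < 1.0 / (state + 1):
            state += 1
        else:
            return False  # a reset occurred: this path is not increasing
    return True

# Estimate P(the first 4 steps are increasing); exact value is
# 1 * 1/2 * 1/3 * 1/4 = 1/24 ~ 0.0417.
n_trials = 100000
estimate = sum(pile_increasing_for(4) for _ in range(n_trials)) / n_trials
print(estimate)
```

Since $\frac{1}{n!}\to 0$, the full event $\{\omega : n\mapsto X_n(\omega)\text{ is increasing}\}$ has probability $0$ here, in contrast to the probability-$1$ event for nowhere differentiability of Brownian paths.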


I found the following explanation helpful in understanding the definition of Brownian motion.

There are two ways to think of $\omega$ in the definition of Brownian motion $B_\omega(t)$. One is to think of $\omega$ as the Brownian motion path. A random experiment is performed, and its outcome is the path of the Brownian motion. Then $B_\omega(t)$ is the value of this path at time $t$, and this value of course depends on which path resulted from the random experiment. Alternatively, one can think of $\omega$ as something more primitive than the path itself, akin to the outcome of a sequence of coin tosses, although now the coin is being tossed "infinitely fast." Once the sequence of coin tosses has been performed and the result $\omega$ obtained, then the path of the Brownian motion can be drawn. If the tossing is done again and a different $\omega$ is obtained, then a different path will be drawn. (Steve Shreve, Stochastic Calculus for Finance II)

In the original text, $B_\omega(t)$ is written as $W(t)$. For clarity, I changed $W(t)$ to $B_\omega(t)$.