Is this definition of Brownian motion correct?

I came across the following definition of Brownian motion:

Definition 2.3. The Brownian motion is a continuous time stochastic process $\{W(t), t \geq 0\}$ that satisfies the following conditions:

  • (i) $W(0)=0$ a.s.;
  • (ii) the paths $t \longmapsto W(t)$ are continuous a.s.;
  • (iii) for $0 \leq s<t<\infty$, the increment $W(t)-W(s)$ is independent of $W(s)$;
  • (iv) for $0 \leq s<t<\infty$, the increment $W(t)-W(s)$ has the normal distribution with mean 0 and variance $t-s$.

and then a proposition

Proposition 2.2. For any $0=t_{0} \leq t_{1} \leq \ldots \leq t_{n}$ the increments $$W\left(t_{1}\right)-W\left(t_{0}\right), \ldots, W\left(t_{n}\right)-W\left(t_{n-1}\right)$$ are independent random variables.

I could not prove this proposition from part (iii) of the definition. Compared with other sources, part (iii) of this definition seems non-standard. Could you confirm whether this definition is correct?


Update 1: I think I found a proof of Prop 2.2. Could you confirm whether my proof is correct?

It suffices to show $W(t_4) - W(t_3) \perp W(t_2) - W(t_1)$ for $0 \leq t_1 \leq t_2 \leq t_3 \leq t_4$. We have $W(t_4) - W(t_2) \perp W(t_2)$ and $W(t_3) - W(t_2) \perp W(t_2)$ by definition. Then $W(t_4) - W(t_3) \perp W(t_2)$. Similarly, we get $W(t_4) - W(t_3) \perp W(t_1)$ and thus $W(t_4) - W(t_3) \perp -W(t_1)$. Finally, $W(t_4) - W(t_3) \perp W(t_2) - W(t_1)$.
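Not a proof, but a quick Monte Carlo sanity check of the statement I am trying to show (a sketch; the time grid, seed, and sample sizes are my own arbitrary choices):

```python
import numpy as np

# Sanity check (not a proof): for a simulated Brownian motion, the
# increments W(t4) - W(t3) and W(t2) - W(t1) over disjoint intervals
# should have sample covariance near zero.
rng = np.random.default_rng(0)
n_paths, n_steps, dt = 20000, 500, 0.002
# Each row is one path: cumulative sums of independent N(0, dt) steps,
# so column k - 1 holds W(k * dt).
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

steps = {"t1": 50, "t2": 150, "t3": 250, "t4": 450}    # times 0.1 .. 0.9
early = W[:, steps["t2"] - 1] - W[:, steps["t1"] - 1]  # W(t2) - W(t1)
late = W[:, steps["t4"] - 1] - W[:, steps["t3"] - 1]   # W(t4) - W(t3)
print(np.mean(early * late))                           # close to 0
```

Both increments have mean zero, so the printed sample mean of the product estimates their covariance.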


EDIT: this proof is valid if $W(t)$ is defined to be a Gaussian stochastic process, i.e. one whose finite-dimensional distributions are jointly Gaussian. The proposition can be proved in the following steps.

  1. From (i) and (iv) we deduce that $\mathbb{E}W(t) = 0$: since $W(0) = 0$ a.s., $W(t) = W(t) - W(0) \sim N(\mu = 0, \sigma^2 = t)$.
  2. Let $C(s, t) = \mathbb{E}\left[\left(W(s) - \mathbb{E}W(s)\right)\left(W(t) - \mathbb{E}W(t)\right)\right]$ denote the covariance function. By step 1, it equals $\mathbb{E} W(s)W(t)$. As $W(t) - W(s) \perp W(s)$ for $0 \leq s < t$ and both factors have mean zero, $$0 = \mathbb{E}\left[\left(W(t) - W(s)\right)W(s)\right] = \mathbb{E} W(s)W(t) - \mathbb{E} W(s)^2 = C(s,t) - s \Rightarrow C(s,t) = s,$$ since $\mathbb{E} W(s)^2$ is the variance of $W(s) \sim N(\mu=0,\sigma^2=s)$. Similarly, for $0 \leq t < s$ we have $W(s) - W(t) \perp W(t)$ and $C(s,t) = t$. Together, this gives $C(s, t) = \min\{s, t\}$.
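A hedged numerical illustration of step 2 (a simulation sketch, not part of the proof; grid and seed are arbitrary choices): for a discretized Brownian motion, the sample mean of $W(s)W(t)$ should approximate $C(s,t) = \min\{s, t\}$.

```python
import numpy as np

# Sanity check of C(s, t) = min(s, t) for a simulated Brownian motion.
rng = np.random.default_rng(1)
n_paths, n_steps, dt = 20000, 500, 0.002
# Column k - 1 holds W(k * dt) for each of the n_paths simulated paths.
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

s, t = 0.3, 0.8                              # min(s, t) = 0.3
Ws = W[:, int(round(s / dt)) - 1]            # column at time s
Wt = W[:, int(round(t / dt)) - 1]            # column at time t
cov_st = np.mean(Ws * Wt)                    # E W(s)W(t); means are zero
print(cov_st)                                # close to 0.3
```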
  3. As the increments $W(t_1) - W(t_0), \dots, W(t_n) - W(t_{n-1})$ are jointly Gaussian, their independence is equivalent to their being pairwise uncorrelated, which can be proved as follows: let $t_{i-1} \leq t_i \leq t_{j-1} \leq t_j$; then $$\begin{aligned} \operatorname{cov}\left(W(t_i)-W(t_{i-1}),\,W(t_j) - W(t_{j-1})\right) &= C(t_i,t_j) - C(t_i,t_{j-1}) - C(t_{i-1},t_j) + C(t_{i-1},t_{j-1}) \\ &= \min\{t_i,t_j\} - \min\{t_i,t_{j-1}\} - \min\{t_{i-1},t_j\} + \min\{t_{i-1},t_{j-1}\} \\ &= t_i - t_i - t_{i-1} + t_{i-1} = 0. \end{aligned}$$
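Step 3 can also be checked numerically (again a sketch under arbitrary choices of grid and seed, not part of the proof): the sample covariance matrix of the successive increments over $0 < 0.1 < 0.3 < 0.6 < 1.0$ should be approximately diagonal, with diagonal entries $t_i - t_{i-1}$.

```python
import numpy as np

# Sample covariance matrix of successive Brownian increments: expect
# approximately diag(0.1, 0.2, 0.3, 0.4) with near-zero off-diagonals.
rng = np.random.default_rng(2)
n_paths, n_steps, dt = 20000, 500, 0.002
# Column k - 1 holds W(k * dt) for each simulated path.
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

steps = [50, 150, 300, 500]                  # times 0.1, 0.3, 0.6, 1.0
cols = [W[:, k - 1] for k in steps]
# Increments W(t1) - W(t0), ..., W(t4) - W(t3), with t0 = 0 and W(0) = 0.
incs = np.stack([cols[0]] + [cols[i] - cols[i - 1] for i in range(1, 4)])
cov = np.cov(incs)                           # rows of incs are the variables
print(np.round(cov, 2))                      # ~ diag(0.1, 0.2, 0.3, 0.4)
```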