Brownian motion mathematical model construction


I quote the construction of a mathematical model of Brownian motion from Schilling and Partzsch, *Brownian Motion: An Introduction to Stochastic Processes*.



Consider a one-dimensional setting where a particle performs a random walk (notice that it can move to the left or to the right with equal probability $\dfrac{1}{2}$). We assume that each particle

  • starts at origin $x=0$;
  • changes its position only at the discrete times $k\Delta t$, $k=1,2,\ldots$, where $\Delta t>0$ is fixed;
  • moves $\Delta x$ units to the left or to the right with equal probability;
  • $\Delta x$ does not depend on any past positions nor the current position $x$ nor on time $t=k\Delta t$.

Let us denote by $X_t$ the random position of the particle at time $t\in\left[0,T\right]$. During the time interval $\left[0, T\right]$, the particle has changed its position $N:=\displaystyle\left\lfloor \frac{T}{\Delta t}\right\rfloor$ times. Since the decision to move left or right is random, we model it by independent, identically distributed Bernoulli random variables $\varepsilon_k$, $k\geq 1$, where, as anticipated above, $$\mathbb{P}\left(\varepsilon_1=1\right)=\mathbb{P}\left(\varepsilon_1=0\right)=\dfrac{1}{2},$$ so that $$S_N=\varepsilon_1+\ldots+\varepsilon_N\hspace{0.3cm}\text{and}\hspace{0.3cm} N-S_N$$ denote the number of right and left moves, respectively. Writing $n:=\left\lfloor \frac{t}{\Delta t}\right\rfloor$ for the number of steps up to time $t$, we have $$X_T=\left(X_T-X_t\right)+\left(X_t-X_0\right)=\sum\limits_{k=n+1}^{N}\left(2\varepsilon_k-1\right)\Delta x +\sum\limits_{k=1}^{n}\left(2\varepsilon_k-1\right)\Delta x.$$ Since the $\varepsilon_k$ are i.i.d. random variables, it follows that $$X_T-X_t \sim X_{T-t}-X_0.$$ We write $\sigma^2\left(t\right):= \mathbb{V} X_t$. By Bienaymé's identity we get $$\mathbb{V} X_T=\mathbb{V}\left(X_T-X_t\right)+\mathbb{V}\left(X_t-X_0\right)=\sigma^2\left(T-t\right)+\sigma^2\left(t\right),$$ which means that $t\mapsto \sigma^2(t)$ is linear: $$\mathbb{V} X_T=\sigma^2\left(T\right)=\sigma^2 T. \tag{\(1\)}$$ Since $\mathbb{E}\varepsilon_1=\frac{1}{2}$ and $\mathbb{V} \varepsilon_1=\frac{1}{4}$, a direct computation gives $$\mathbb{V} X_T = N\left(\Delta x\right)^2=\frac{T}{\Delta t}\left(\Delta x\right)^2, \tag{\(2\)}$$ which reveals that $$\dfrac{\left(\Delta x\right)^2}{\Delta t}=\sigma^2=\text{constant}.\tag{\(3\)}$$
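A quick Monte Carlo sanity check of equation $(2)$ can be sketched in Python. The parameter values below ($T$, $\Delta x$, the number of steps, and the number of paths) are illustrative choices, not taken from the book; they are picked so that $(\Delta x)^2/\Delta t = 1$, i.e. $\sigma^2 = 1$ in $(3)$:

```python
# Monte Carlo check of (2): V X_T = (T / dt) * (dx)^2 for the random walk.
# All parameter values are illustrative choices.
import random

random.seed(0)

T, dx = 1.0, 0.1
N = 100            # number of steps, floor(T / dt)
dt = T / N         # chosen so (dx)**2 / dt = 1, i.e. sigma^2 = 1 by (3)
n_paths = 20_000

positions = []
for _ in range(n_paths):
    x = 0.0
    for _ in range(N):
        eps = random.randint(0, 1)   # Bernoulli(1/2) decision epsilon_k
        x += (2 * eps - 1) * dx      # move dx to the right or to the left
    positions.append(x)

mean = sum(positions) / n_paths
var = sum((p - mean) ** 2 for p in positions) / n_paths

# Both numbers should be close to 1.0.
print(f"empirical variance ~ {var:.3f}, predicted N*(dx)^2 = {N * dx**2:.3f}")
```

With $2\cdot 10^4$ paths the empirical variance agrees with $N(\Delta x)^2$ up to ordinary Monte Carlo error.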



As to the above quoted part, I have two doubts:

  1. As to the definition of the variance of $X_t$, that is $\sigma^2\left(t\right):= \mathbb{V} X_t$, I naturally interpret this as saying that $\sigma^2(t)$ is a function of $t$. At this point, given that by definition $\sigma^2\left(T\right):= \mathbb{V} X_T$ and also that $\mathbb{V} X_T=\mathbb{V}\left(X_T-X_t\right)+\mathbb{V}\left(X_t-X_0\right)=\sigma^2\left(T-t\right)+\sigma^2\left(t\right)$, I deduce that $\sigma^2(T)=\sigma^2\left(T-t\right)+\sigma^2\left(t\right)$. However, how can this lead to the second equality in $(1)$, that is $\sigma^2\left(T\right)=\sigma^2 T$? That is, why can one state that $\sigma^2\left(T\right)$ corresponds to $\sigma^2$ times $T$?
  2. I suppose that a direct computation gives equation $(2)$ only if one considers $\Delta x$ a constant (and then computes the variance of $X_T=\sum\limits_{k=1}^N(2\varepsilon_k-1)\Delta x$). However, it appears to me that $\Delta x$ is not a constant but, by definition, a random variable!
     Hence, how can a direct computation lead to equation $(2)$?


On BEST ANSWER
  1. Note that the position only changes at the discrete times $k\Delta t$, so it suffices to consider natural time arguments. Hence $$ \sigma^2(T)=\sigma^2(T-1)+\sigma^2(1)=\sigma^2(T-2)+\sigma^2(1)+\sigma^2(1)=\dots=T\sigma^2(1) $$ This is an instance of the general fact that additive functions on $\mathbb{N}$ are automatically linear; under a mild regularity assumption (for example monotonicity or measurability), additive functions on $\mathbb{R}$ are linear as well, by Cauchy's functional equation.
  2. $\Delta x$ is not a random variable; it is the fixed length of a single jump of the particle. The random part is whether the particle moves left or right, i.e. the sign $2\varepsilon_k-1$.
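Both points can be checked numerically. The sketch below (the values `dx = 1/10` and `sigma2_of_1 = 37/100` are arbitrary illustrative choices) first verifies exactly, using rational arithmetic, that a single step $(2\varepsilon_k-1)\Delta x$ with fixed $\Delta x$ has variance $(\Delta x)^2$, and then that a function defined only through the recursion $\sigma^2(t)=\sigma^2(t-1)+\sigma^2(1)$ on the natural numbers is forced to be linear:

```python
from fractions import Fraction

# Point 2: dx is a fixed step length; only the sign 2*eps - 1 is random.
dx = Fraction(1, 10)                              # illustrative fixed step length
steps = [(2 * eps - 1) * dx for eps in (0, 1)]    # the only two possible moves
mean = sum(steps) / 2                             # E[(2*eps - 1)*dx] = 0
var = sum((s - mean) ** 2 for s in steps) / 2     # exact variance over both outcomes
assert mean == 0 and var == dx ** 2               # V = (dx)^2, matching equation (2)

# Point 1: a function satisfying sigma^2(t) = sigma^2(t-1) + sigma^2(1)
# on the natural numbers is forced to be linear: sigma^2(T) = T * sigma^2(1).
sigma2_of_1 = Fraction(37, 100)                   # arbitrary value of sigma^2(1)

def sigma2(t):
    """Defined only through the additivity recursion (t a natural number)."""
    return Fraction(0) if t == 0 else sigma2(t - 1) + sigma2_of_1

assert all(sigma2(T) == T * sigma2_of_1 for T in range(10))
print("single-step variance:", var)               # prints 1/100
```

Using `Fraction` keeps both computations exact, so the assertions check the identities themselves rather than floating-point approximations.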