A simple characterization of the Brownian Motion


A well-known characterization of Brownian motion says that it is the only continuous process $(X_t)_{t\ge 0}$ such that

  • $P(X_0=0)=1$, $E[X_t^2]=t$, $E[X_t]=0$ for any $t\ge 0$
  • the increments are independent, that is $X_{t_n}-X_{t_{n-1}},\dots,X_{t_1}-X_{t_0}$ are independent for any finite sequence $0\le t_0<t_1<\dots<t_n$
  • for any $h>0$ the law of $X_{t+h}-X_t$ depends only on $h$
  • $E[|X_t|^3]\le C$ for any $t\in [0,\delta]$, where $C$ and $\delta$ are some positive constants (this is the only technical hypothesis).
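These conditions are easy to probe numerically for a genuine Brownian motion. A small sanity check (an illustration only, assuming `numpy`): simulate many paths by summing i.i.d. Gaussian increments, then verify the moment conditions and the independence of increments empirically.

```python
import numpy as np

# Illustration only: simulate standard Brownian motion on [0, 1] by summing
# i.i.d. Gaussian increments, then check the listed properties empirically.
rng = np.random.default_rng(0)
n_paths, n_steps, t_max = 200_000, 100, 1.0
dt = t_max / n_steps

increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

X_half = paths[:, n_steps // 2 - 1]        # X_{1/2}
X_one = paths[:, -1]                       # X_1

print(abs(X_one.mean()))                   # E[X_1] should be ~ 0
print(X_one.var())                         # E[X_1^2] should be ~ 1
print(np.corrcoef(X_half, X_one - X_half)[0, 1])  # ~ 0: increments independent
```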

I read that there is an elementary proof of this characterization (i.e. that such a process is a Brownian Motion) based on the central limit theorem (in fact, the only thing to show is that the increments have normal distribution, so it is reasonable that CLT could help), but I fail to see how this can be done. Do you have any ideas or references?

Update: Apparently, the last technical hypothesis is not necessary. See my answer below.


BEST ANSWER

As mentioned in the comments, we have to show that for each $t$, the random variable $X_t$ is Gaussian.

A first approach is the following: for a fixed $t$ we write $$X_t=\sum_{j=1}^n \left(X_{tj/n} - X_{t(j-1)/n}\right)=:\sum_{j=1}^n Y_{n,j},$$ where, by independence and stationarity of the increments, the $Y_{n,j}$, $1\le j\le n$, are i.i.d. with the same law as $X_{t/n}$, and show that this triangular array satisfies Lindeberg's condition for the central limit theorem, i.e., we have to prove that $$\tag{Lindeberg} \lim_{n\to\infty} n\,\mathbb E\left[X_{t/n}^2\, \chi\{| X_{t/n}|\gt \varepsilon\}\right]=0 \quad\mbox{for each positive }\varepsilon.$$
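As a consistency check of (Lindeberg), not a step of the proof: when the increments are genuinely Gaussian, the quantity $n\mathbb E[X_{t/n}^2\chi\{|X_{t/n}|>\varepsilon\}]$ has a closed form via the truncated second moment of a normal law, $\mathbb E[W^2 1_{\{|W|>a\}}]=2a\varphi(a)+2(1-\Phi(a))$ for standard normal $W$, and one can watch it decrease to $0$ (Python, standard library only):

```python
import math

def lindeberg_term(t, n, eps):
    """n * E[Z^2 * 1{|Z| > eps}] for Z ~ N(0, t/n), in closed form.

    Uses E[W^2 * 1{|W| > a}] = 2*a*phi(a) + 2*(1 - Phi(a)) for standard normal W.
    """
    sigma = math.sqrt(t / n)
    a = eps / sigma
    phi = math.exp(-a * a / 2) / math.sqrt(2 * math.pi)   # standard normal pdf
    tail = 0.5 * math.erfc(a / math.sqrt(2))              # 1 - Phi(a)
    return n * (t / n) * (2 * a * phi + 2 * tail)

for n in (4, 16, 64, 256):
    print(n, lindeberg_term(t=1.0, n=n, eps=0.5))  # decreases to 0
```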

Claim. The sequence $(n\mathbb E[|X_{t/n}|^3 ] )_{n\geqslant n_0} $ is bounded, where $n_0$ is such that $t/n_0\lt\delta$.

To see this, we apply Rosenthal's inequality (in the direction that bounds the sum of third moments of independent centered random variables by the third moment of their sum) to the family $(X_{tj/n}-X_{t(j-1)/n} )_{j=1}^n $ with exponent $3$. We get $$\sum_{j=1}^n\mathbb E|X_{tj/n}-X_{t(j-1)/n} |^3 \leqslant K\,\mathbb E[|X_t|^3 ].$$ The term $\mathbb E[|X_t|^3 ]$ is finite even when $t$ is greater than $\delta$: write $X_t$ as a finite sum of increments $X_{t_i}-X_{t_{i-1}}$ with $t_i-t_{i-1}\leqslant\delta$, each of which has third moment at most $C$ by stationarity, and apply Minkowski's inequality. Since, by stationarity of the increments, the left-hand side above equals $n\,\mathbb E|X_{t/n}|^3$, we obtain $$n\,\mathbb E|X_{t/n}|^3\leqslant K\,C(t), $$ where $C(t)$ depends only on $t$. This proves the claim.

Lemma. Condition (Lindeberg) is satisfied if for each positive $\varepsilon$, $n\mathbb P(|X_{t/n}|\gt\varepsilon )\to 0$.

Indeed, splitting according to whether $|X_{t/n}|>R$ (where $x^2\le |x|^3/R$) or $\varepsilon<|X_{t/n}|\le R$ (where $x^2\le R^2$), we have for each $n$ and $R$, $$n\mathbb E[X_{t /n}^2 \chi\{| X_{t /n}|\gt \varepsilon\} ]\leqslant \frac nR\mathbb E|X_{t/n} |^3 +R^2n\mathbb P(|X_{t/n} |\gt\varepsilon )$$ and so, by the claim, $$n\mathbb E\left[X_{t /n}^2 \chi\{| X_{t /n}|\gt \varepsilon\}\right]\leqslant \frac{ KC(t)}R +R^2n\mathbb P(|X_{t/n} |\gt\varepsilon );$$ letting $n\to\infty$ and then $R\to\infty$ gives the lemma. To conclude the proof, we notice that $$1-\left(1-P(|X_{t/n}|>\varepsilon)\right)^n=P\left(|Y_{n,i}|>\varepsilon\text{ for some }1\le i\le n\right)\to 0$$ as $n\to\infty$: by the uniform continuity of $s\mapsto X_s$ on $[0,t]$, the largest increment $\max_i|Y_{n,i}|$ tends to $0$ almost surely, so the probabilities tend to $0$. Thus $n\log\left(1-P(|X_{t/n}|>\varepsilon)\right)\to 0$, and since $$n\log\left(1-P(|X_{t/n}|>\varepsilon)\right)\le -nP(|X_{t/n}|>\varepsilon)\le 0,$$ we finally get $nP(|X_{t/n}|>\varepsilon)\to 0$, which completes the proof.
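For Gaussian increments, both $n\mathbb P(|X_{t/n}|>\varepsilon)$ and the probability $1-(1-P(|X_{t/n}|>\varepsilon))^n$ of some large increment can be evaluated exactly, which illustrates (but of course does not prove) the convergence used above. A quick check in Python, standard library only:

```python
import math

def tail_prob(t, n, eps):
    """p_n = P(|Z| > eps) for Z ~ N(0, t/n), via the normal tail 2*(1 - Phi(a))."""
    a = eps * math.sqrt(n / t)
    return math.erfc(a / math.sqrt(2))

for n in (10, 100, 1000):
    p = tail_prob(1.0, n, 0.5)
    # n*p and P(some increment exceeds eps) both tend to 0
    print(n, n * p, 1 - (1 - p) ** n)
```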

SECOND ANSWER

I just found another proof (not mine) which does not use the last hypothesis on the third moments, so I'm writing it here for its own interest.

We use two simple facts:

  1. For any $t>0$ the characteristic function of $X_t$ never vanishes: $\mathbb{E}[e^{iuX_t}]\neq 0$ (indeed, $X_t$ is a sum of $n$ i.i.d. increments for every $n$, hence infinitely divisible, and the characteristic function of an infinitely divisible law never vanishes). So there is a well-defined choice of the logarithm, $u\mapsto \ln\mathbb{E}[e^{iuX_t}]$, such that this map is continuous in $u\in\mathbb{R}$ and vanishes at $u=0$. We will implicitly refer to this choice.
  2. Fix a time $T>0$. As Davide already showed at the end of his answer, for any $\epsilon$ we have $n\mathbb{P}(|X_{T/n}|>\epsilon)\to 0$ as $n\to\infty$.

From the first fact, since $\ln \mathbb{E}[e^{iuX_T}]=\ln\left(\mathbb{E}[e^{iuX_{T/n}}]^n\right)=n\ln\mathbb{E}[e^{iuX_{T/n}}]$, we have $\ln\mathbb{E}[e^{iuX_{T/n}}]=\frac{1}{n}\ln\mathbb{E}[e^{iuX_T}]$, so that $$\mathbb{E}[e^{iuX_{T/n}}]=\exp\left(\frac{1}{n}\ln\mathbb{E}[e^{iuX_T}]\right)=1+\frac{1}{n}\ln\mathbb{E}[e^{iuX_T}]+O\left(\frac{1}{n^2}\right)$$ and finally $$n\,\mathbb{E}[e^{iuX_{T/n}}-1]\to\ln\mathbb{E}[e^{iuX_T}].\ \ \ \ (*)$$
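For Brownian motion itself one has $\mathbb{E}[e^{iuX_s}]=e^{-u^2s/2}$, so $(*)$ reads $n(\mathbb{E}[e^{iuX_{T/n}}]-1)\to -u^2T/2$ and can be checked numerically (illustration only; the characteristic function is real here, so plain floats suffice):

```python
import math

def cf_increment(u, s):
    """E[e^{iuX_s}] = exp(-u^2 s / 2) for standard Brownian motion (real-valued)."""
    return math.exp(-u * u * s / 2)

u, T = 1.3, 2.0
target = -u * u * T / 2                     # ln E[e^{iuX_T}]
for n in (10, 100, 1000, 10000):
    approx = n * (cf_increment(u, T / n) - 1)
    print(n, approx, abs(approx - target))  # error shrinks like O(1/n)
```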

Now the second fact implies that

For any bounded function $f:\mathbb{R}\to\mathbb{R}$ with $f(x)=o(x^2)$ as $x\to 0$, we have $n\mathbb{E}[f(X_{T/n})]\to 0$.

To prove this, put $M:=\|f\|_\infty$ and fix any $\epsilon'>0$. Choose $\epsilon>0$ such that $|f(x)|\le\epsilon'x^2$ whenever $|x|\le\epsilon$. Then $$n|\mathbb{E}[f(X_{T/n})]|\le n\left|\mathbb{E}[f(X_{T/n})1_{|X_{T/n}|>\epsilon}]\right|+n\left|\mathbb{E}[f(X_{T/n})1_{|X_{T/n}|\le\epsilon}]\right|.$$ The first term is at most $nM\,\mathbb{P}(|X_{T/n}|>\epsilon)$, which tends to $0$ by the second fact, while the second term is at most $n\epsilon'\,\mathbb{E}[X_{T/n}^2]=\epsilon'T$. Since $\epsilon'$ is arbitrarily small, this proves our claim.

In particular, choosing $f(x):=\cos (x)-1+\frac{x^2}{2}1_{|x|\le 1}$ (which is bounded and $o(x^2)$ as $x\to 0$) and taking the real part of $(*)$ with $u=1$, we deduce that the limit $\lim_{n\to\infty}n\mathbb{E}[X_{T/n}^2 1_{|X_{T/n}|\le 1}]=:A$ exists (and is finite). In the same way, using the imaginary part of $(*)$ with $f(x):=\sin(x)-x\,1_{|x|\le 1}$, the limit $\lim_{n\to\infty}n\mathbb{E}[X_{T/n} 1_{|X_{T/n}|\le 1}]=:\gamma$ exists.
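Taking Brownian motion as the reference case (so $X_{T/n}\sim N(0,T/n)$), the truncated second moment defining $A$ has a closed form, and one can check numerically that $A=T$ (while $\gamma=0$ by symmetry). A quick illustration in Python, standard library only:

```python
import math

def truncated_A(T, n):
    """n * E[Z^2 * 1{|Z| <= 1}] for Z ~ N(0, T/n), in closed form.

    Subtracts the tail E[Z^2 * 1{|Z| > 1}] from E[Z^2] = T/n.
    """
    sigma = math.sqrt(T / n)
    a = 1.0 / sigma
    phi = math.exp(-a * a / 2) / math.sqrt(2 * math.pi)   # standard normal pdf
    tail = 0.5 * math.erfc(a / math.sqrt(2))              # 1 - Phi(a)
    return T * (1 - (2 * a * phi + 2 * tail))

T = 2.0
for n in (4, 16, 64, 256):
    print(n, truncated_A(T, n))   # increases toward A = T = 2.0
```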

Finally, with $f(x):=e^{iux}-e^{iux}1_{|x|\le 1}$ (applying the lemma separately to the real and imaginary parts), we get $$ g(u):=\ln\mathbb{E}[e^{iuX_T}]=n\ln\mathbb{E}[e^{iuX_{T/n}}]=n\ln\left(\mathbb{E}\left[e^{iuX_{T/n}}1_{|X_{T/n}|\le 1}\right]+o\left(\frac{1}{n}\right)\right) $$ and with $f(x):=e^{iux}1_{|x|\le 1}-\left(1+iux-\frac{u^2}{2}x^2\right)1_{|x|\le 1}$ (now $u\in\mathbb{R}$ is fixed) we obtain $$ g(u)=n\ln\left(\mathbb{E}\left[\left(1+iuX_{T/n}-\frac{u^2}{2}X_{T/n}^2\right)1_{|X_{T/n}|\le 1}\right]+o\left(\frac{1}{n}\right)\right)=n\ln\left(1+\frac{iu\gamma}{n}-\frac{u^2}{2n}A+o\left(\frac{1}{n}\right)\right)=n\left(\frac{iu\gamma}{n}-\frac{u^2}{2n}A+o\left(\frac{1}{n}\right)\right)\to iu\gamma-\frac{u^2}{2}A. $$ Hence $\mathbb{E}[e^{iuX_T}]=e^{iu\gamma-u^2A/2}$, so $X_T$ has a Gaussian distribution; since $\mathbb{E}[X_T]=0$ and $\mathbb{E}[X_T^2]=T$, necessarily $\gamma=0$ and $A=T$, i.e. $X_T\sim N(0,T)$.

(I found this proof sketched in some lecture notes by Peter Tankov.)