A well-known characterization of the Brownian Motion says that it is the only continuous process $X_t$ (defined on $[0,\infty)$) such that
- $P(X_0=0)=1$, $E[X_t^2]=t$, $E[X_t]=0$ for any $t\ge 0$
- the increments are independent, that is $X_{t_n}-X_{t_{n-1}},\dots,X_{t_1}-X_{t_0}$ are independent for any finite sequence $0\le t_0<t_1<\dots<t_n$
- for any $h>0$ the law of $X_{t+h}-X_t$ depends only on $h$
- $E[|X_t|^3]\le C$ for any $t\in [0,\delta]$, where $C$ and $\delta$ are some positive constants (this is the only technical hypothesis).
I read that there is an elementary proof of this characterization (i.e. that such a process is a Brownian Motion) based on the central limit theorem (in fact, the only thing to show is that the increments have normal distribution, so it is reasonable that CLT could help), but I fail to see how this can be done. Do you have any ideas or references?
Update: Apparently, the last technical hypothesis is not necessary. See my answer below.
As mentioned in the comments, we have to show that for each $t$, the random variable $X_t$ is Gaussian.
A first approach is the following: for a fixed $t$, we write $$X_t=\sum_{j=1}^n \left(X_{ t\frac jn} - X_{ t\frac{j-1}n}\right)=:\sum_{j=1}^n Y_{n,j} $$ and show that the triangular array $(Y_{n,j})_{1\leqslant j\leqslant n}$ satisfies Lindeberg's condition for the central limit theorem. Since each $Y_{n,j}$ has the same law as $X_{t/n}$ by stationarity of the increments, this amounts to proving that $$\tag{Lindeberg} \mbox{for each positive } \varepsilon,\quad \lim_{n\to\infty} n\,\mathbb E\left[X_{t /n}^2\, \chi\{| X_{t /n}|\gt \varepsilon\}\right]=0. $$
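For reference, the version of the Lindeberg–Feller theorem being invoked (the standard textbook statement, written here for the triangular array at hand) is: for each $n$, let $Y_{n,1},\dots,Y_{n,n}$ be independent and centered, and set $s_n^2:=\sum_{j=1}^n\mathbb E[Y_{n,j}^2]$. If
$$\mbox{for every }\varepsilon>0,\quad \frac1{s_n^2}\sum_{j=1}^n\mathbb E\left[Y_{n,j}^2\,\chi\{|Y_{n,j}|>\varepsilon s_n\}\right]\xrightarrow[n\to\infty]{}0,$$
then $s_n^{-1}\sum_{j=1}^n Y_{n,j}$ converges in distribution to $N(0,1)$. In our case $\mathbb E[Y_{n,j}^2]=t/n$, so $s_n^2=t$ for every $n$, and each $Y_{n,j}$ has the law of $X_{t/n}$; after absorbing the constants $t$ and $\sqrt t$ into $\varepsilon$, the displayed condition is exactly $(\text{Lindeberg})$ above.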
The key estimate is the following moment bound, which we will refer to as the claim: $$n\,\mathbb E|X_{t/n}|^3\leqslant K\,C(t), $$ where $C(t)$ depends only on $t$. To prove it, we apply (the lower-bound half of) Rosenthal's inequality with exponent $3$ to the independent and centered family $(X_{tj/n}-X_{t(j-1)/n} )_{j=1}^n$, whose sum is $X_t$. This gives $$\sum_{j=1}^n\mathbb E\left|X_{tj/n}-X_{t(j-1)/n}\right|^3 \leqslant K\,\mathbb E[|X_t|^3 ].$$ The term $\mathbb E[|X_t|^3 ]$ is finite even if $t$ is greater than $\delta$ (we write $X_t$ as a finite sum of increments $X_{t_i}-X_{t_{i-1}}$ with $t_i-t_{i-1}\leqslant\delta$). Since by stationarity of the increments each of the $n$ terms on the left equals $\mathbb E|X_{t/n}|^3$, the claim follows.
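The parenthetical reduction can be made explicit with one elementary convexity estimate (Jensen's inequality for $x\mapsto |x|^3$ applied to the arithmetic mean; the constants are not optimal): with $0=t_0<t_1<\dots<t_m=t$ and $t_i-t_{i-1}\leqslant\delta$,
$$\mathbb E|X_t|^3=\mathbb E\left|\sum_{i=1}^m \left(X_{t_i}-X_{t_{i-1}}\right)\right|^3\leqslant m^2\sum_{i=1}^m\mathbb E\left|X_{t_i}-X_{t_{i-1}}\right|^3\leqslant m^3 C,$$
since each increment has the law of $X_{t_i-t_{i-1}}$ with $t_i-t_{i-1}\leqslant\delta$, so its third absolute moment is at most $C$ by the technical hypothesis; one may take $m=\lceil t/\delta\rceil$.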
Indeed, splitting the expectation according to whether $|X_{t/n}|\leqslant R$ or $|X_{t/n}|>R$, we have for each $n$ and each $R>0$, $$n\mathbb E[X_{t /n}^2 \chi\{| X_{t /n}|\gt \varepsilon\} ]\leqslant \frac nR\mathbb E|X_{t/n} |^3 +R^2n\mathbb P(|X_{t/n} |\gt\varepsilon )$$ and hence, by the claim, $$n\mathbb E\left[X_{t /n}^2 \chi\{| X_{t /n}|\gt \varepsilon\}\right]\leqslant \frac{ KC(t)}R +R^2n\mathbb P(|X_{t/n} |\gt\varepsilon ).$$ It remains to show that $n\mathbb P(|X_{t/n}|>\varepsilon)\to 0$. By independence and stationarity of the increments, $$1-\left(1-P(|X_{t/n}|>\varepsilon)\right)^n=P\left(|Y_{n,i}|>\varepsilon\text{ for some }1\le i\le n\right)\to 0$$ as $n\to\infty$, since the paths are uniformly continuous on $[0,t]$, so for large $n$ no increment over an interval of length $t/n$ exceeds $\varepsilon$. Thus $n\log\left(1-P(|X_{t/n}|>\varepsilon)\right)\to 0$, and since $\log(1-x)\le -x$, $$n\log\left(1-P(|X_{t/n}|>\varepsilon)\right)\le -nP(|X_{t/n}|>\varepsilon)\le 0,$$ so $nP(|X_{t/n}|>\varepsilon)\to 0$. Letting $n\to\infty$ in the bound above gives $\limsup_n n\mathbb E[X_{t/n}^2\,\chi\{|X_{t/n}|>\varepsilon\}]\leqslant KC(t)/R$ for every $R>0$, and letting $R\to\infty$ yields the Lindeberg condition. The central limit theorem then shows that $\sum_{j=1}^n Y_{n,j}$ converges in distribution to $N(0,t)$; since this sum equals $X_t$ for every $n$, we conclude that $X_t\sim N(0,t)$, which completes the proof.
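As a quick numerical illustration (not part of the proof): once we know the conclusion, $X_{t/n}\sim N(0,t/n)$, the quantity $n\,\mathbb P(|X_{t/n}|>\varepsilon)$ has a closed form via the complementary error function, and one can watch it collapse to zero. A minimal sketch in Python, using only the standard library (the function name `n_times_tail` and the default values $t=1$, $\varepsilon=0.5$ are just for illustration):

```python
import math

def n_times_tail(n, t=1.0, eps=0.5):
    """n * P(|X_{t/n}| > eps) when X_{t/n} ~ N(0, t/n).

    For a centered Gaussian with variance t/n,
    P(|X| > eps) = erfc(eps / sqrt(2*t/n)).
    This uses the *conclusion* of the proof, so it is only
    a consistency check, not an independent argument.
    """
    sigma = math.sqrt(t / n)
    return n * math.erfc(eps / (sigma * math.sqrt(2.0)))

# The sequence should decrease rapidly to 0 as n grows,
# consistent with n * P(|X_{t/n}| > eps) -> 0 shown above.
for n in (10, 100, 1000):
    print(n, n_times_tail(n))
```

The Gaussian tail decays like $e^{-n\varepsilon^2/(2t)}$, so it beats the linear factor $n$ by a wide margin.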