How to easily see that the time integral of a Brownian motion is normally distributed?


It's well known that $X_t:=\int_0^t B_\tau \, d\tau$, where $\{B_\tau\}$ is a one-dimensional standard Brownian motion, is distributed as $N(0, t^3/3)$. Is there any "immediate" way to see this fact?

The easiest argument I can come up with: discretise $X_t$ into a Riemann sum, break each $B_{\tau_i}$ into independent increments over tiny intervals, observe that the sum is a linear combination of independent normal random variables, then take the limit and use convergence in distribution to conclude.

I wouldn't say this is hard, but it is in no way trivial or immediate either. Can we somehow see this fact without any effort whatsoever? Thanks.

Best answer

There are three things which need to be done:

  1. Check that $X_t$ is Gaussian.
  2. Compute its expectation.
  3. Compute its variance.

1. $X_t$ is Gaussian: It seems to me that the most natural (and also direct) way to prove this is an approximation by Riemann sums. Noting that

$$X_t^{(n)} := \sum_{j=1}^n \frac{t}{n} B_{t j/n} \tag{1}$$

is Gaussian for each $t>0$ (because $(B_t)_{t \geq 0}$ is a Gaussian process), we find that $$X_t = \lim_{n \to \infty} X_t^{(n)}$$ is Gaussian as a pointwise limit of Gaussian random variables. For an alternative argument see the very end of my answer.
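For what it's worth, the approximation $(1)$ can also be checked numerically. The sketch below (plain numpy, purely illustrative; the grid size, sample count and $t$ are arbitrary choices) simulates $X_t^{(n)}$ and compares its sample skewness and excess kurtosis with the Gaussian values $0$ and $0$.

```python
import numpy as np

rng = np.random.default_rng(0)
t, n, paths = 2.0, 100, 100_000

# Brownian increments on a grid of mesh t/n; cumulative sums give B_{t j/n}.
B = np.cumsum(rng.normal(0.0, np.sqrt(t / n), size=(paths, n)), axis=1)
X = (t / n) * B.sum(axis=1)             # Riemann sum X_t^{(n)} as in (1)

m, s = X.mean(), X.std()
skew = np.mean(((X - m) / s) ** 3)      # ~ 0 for a Gaussian
kurt = np.mean(((X - m) / s) ** 4) - 3  # excess kurtosis, ~ 0 for a Gaussian
print(m, skew, kurt)
```

The sample mean `m` also comes out near $0$, which is the content of step 2 below.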

2. Compute $\mathbb{E}(X_t)$: Since $\mathbb{E}(B_s)=0$ for all $s \geq 0$, it follows that each $X_t^{(n)}$ (defined in $(1)$) has expectation zero, and hence its limit $X_t = \lim_n X_t^{(n)}$ has expectation zero. Alternatively, we can apply Fubini's theorem:

$$\mathbb{E}(X_t) = \mathbb{E} \left( \int_0^t B_s \, ds \right) = \int_0^t \underbrace{\mathbb{E}(B_s)}_{=0} \, ds =0.$$

3. Compute $\text{var}(X_t)$: Since we already know that $\mathbb{E}(X_t)=0$, we clearly have $\text{var}(X_t) = \mathbb{E}(X_t^2)$. As

$$X_t^2 = \int_0^t \int_0^t B_s B_r \, ds \, dr$$

it follows from Fubini's theorem that

$$\mathbb{E}(X_t^2) = \int_0^t \int_0^t \mathbb{E}(B_s B_r) \, ds \, dr. \tag{2}$$

By symmetry, we thus get

$$\mathbb{E}(X_t^2) = 2 \int_0^t \int_0^r \underbrace{\mathbb{E}(B_s B_r)}_{\min\{s,r\}=s} \, ds \, dr = 2 \int_0^t \frac{r^2}{2} \, dr = \frac{t^3}{3}.$$

If you don't like symmetrization, then note that (2) implies

$$\mathbb{E}(X_t^2) = \int_0^t \int_0^r \underbrace{\mathbb{E}(B_s B_r)}_{\min\{s,r\}=s} \, ds \, dr + \int_0^t \int_r^t \underbrace{\mathbb{E}(B_s B_r)}_{\min\{s,r\}=r} \, ds \, dr$$

and each of the integrals can be calculated explicitly using standard calculus.
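Both variants reduce to elementary iterated integrals. As an illustration only (using sympy; this is not part of the argument), the symmetrized version can be checked symbolically:

```python
import sympy as sp

s, r, t = sp.symbols("s r t", positive=True)

# On the region 0 <= s <= r we have E(B_s B_r) = min(s, r) = s.
inner = sp.integrate(s, (s, 0, r))        # inner integral: r**2/2
var = 2 * sp.integrate(inner, (r, 0, t))  # symmetrization factor 2
print(var)
```

sympy returns `t**3/3`, matching the computation above.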


Let me close this answer with a result which combines all three steps into one.

Proposition: Let $(L_t)_{t \geq 0}$ be a Lévy process with characteristic exponent $\psi$, i.e. $$\mathbb{E}\exp(i \xi L_t) = \exp(-t \psi(\xi)), \qquad t \geq 0, \xi \in \mathbb{R}. \tag{3}$$ Then the characteristic function of $$X_t := \int_0^t L_s \, ds$$ equals $$\phi(\xi) = \exp \left(- \int_0^t \psi(\xi s) \, ds \right), \qquad \xi \in \mathbb{R}.$$

If you are not familiar with Lévy processes (that is, stochastic processes with independent and stationary increments), then you can just think of a Brownian motion; in this case $\psi$ is given by $\psi(\xi) = \xi^2/2$. Applying the proposition, we thus find that the characteristic function of $X_t = \int_0^t B_s \, ds$ equals $$\exp \left(- \frac{t^3}{3} \frac{\xi^2}{2} \right)$$ which is the characteristic function of $N(0,t^3/3)$, and so $X_t \sim N(0,t^3/3)$.
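To make this concrete, a small Monte Carlo sketch (numpy; grid size, sample count and parameter values are arbitrary choices, not prescribed by the answer) compares the empirical characteristic function of the Riemann-sum approximation of $X_t$ with $\exp(-\tfrac{t^3}{3}\tfrac{\xi^2}{2})$:

```python
import numpy as np

rng = np.random.default_rng(1)
t, n, paths = 1.5, 200, 50_000

# Simulate B on a grid of mesh t/n and form the Riemann sum for int_0^t B_s ds.
B = np.cumsum(rng.normal(0.0, np.sqrt(t / n), size=(paths, n)), axis=1)
X = (t / n) * B.sum(axis=1)

xi = 0.8
empirical = np.exp(1j * xi * X).mean()    # estimate of E exp(i xi X_t)
theory = np.exp(-(t**3 / 3) * xi**2 / 2)  # char. function of N(0, t^3/3)
err = abs(empirical - theory)
print(empirical, theory, err)
```

The two values agree up to Monte Carlo noise and the $O(1/n)$ discretization bias.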

Proof of the proposition: For fixed $n \in \mathbb{N}$ and $t>0$ set $t_j := t j/n$ for $j=0,1,\ldots,n$, and set

$$\phi_n(\xi) := \mathbb{E} \exp \left( i \xi \frac{t}{n} \sum_{j=1}^n L_{t_j} \right).$$

Denote by $\mathcal{F}_t := \sigma(L_s; s \leq t)$ the canonical filtration of $(L_t)_{t \geq 0}$. Using the tower property of the conditional expectation, we find

$$\begin{align*} \phi_n(\xi) &= \mathbb{E} \bigg\{ \mathbb{E} \bigg[ \exp \left( i \xi \frac{t}{n} \sum_{j=1}^n L_{t_j} \right) \mid \mathcal{F}_{t_{n-1}} \bigg] \bigg\} \\ &= \mathbb{E} \bigg\{ \exp \left( i \xi \frac{t}{n} \sum_{j=1}^{n-1} L_{t_j} \right) \mathbb{E} \bigg[ \exp \left( i \xi \frac{t}{n} L_{t_n} \right) \mid \mathcal{F}_{t_{n-1}} \bigg] \bigg\} \tag{4} \end{align*}$$

Since $(L_t)_{t \geq 0}$ has independent increments, we have

$$\begin{align*} \mathbb{E} \bigg[ \exp \left( i \xi \frac{t}{n} L_{t_n} \right) \mid \mathcal{F}_{t_{n-1}} \bigg] &=\exp\left(i \xi \frac{t}{n} L_{t_{n-1}}\right) \mathbb{E} \bigg[ \exp \left( i \xi \frac{t}{n} (L_{t_n}-L_{t_{n-1}}) \right) \mid \mathcal{F}_{t_{n-1}} \bigg] \\ &= \exp\left(i \xi \frac{t}{n} L_{t_{n-1}}\right) \mathbb{E}\exp\left( i \xi \frac{t}{n} (L_{t_n}-L_{t_{n-1}}) \right). \end{align*}$$

Using that $(L_t)_{t \geq 0}$ has stationary increments, i.e. $L_{t_n}-L_{t_{n-1}} \sim L_{t_n-t_{n-1}}=L_{t/n}$, and using $(3)$ we thus get

$$ \mathbb{E} \bigg[ \exp \left( i \xi \frac{t}{n} L_{t_n} \right) \mid \mathcal{F}_{t_{n-1}} \bigg] = \exp\left(i \xi \frac{t}{n} L_{t_{n-1}}\right) \exp \left(- \frac{t}{n} \psi \left( \frac{t\xi}{n} \right) \right).$$

Plugging this into $(4)$, we obtain that

$$\phi_n(\xi) = \mathbb{E} \bigg\{ \exp \left( i \xi \frac{t}{n} \sum_{j=1}^{n-2} L_{t_j} + i \xi \frac{2t}{n} L_{t_{n-1}} \right) \bigg\} \exp \left(- \frac{t}{n} \psi \left( \frac{t\xi}{n} \right) \right).$$

Iterating this reasoning (i.e. next conditioning on $\mathcal{F}_{t_{n-2}}$, then on $\mathcal{F}_{t_{n-3}}$, ...) we conclude that

$$\phi_n(\xi) = \exp \left( - \frac{t}{n} \sum_{j=1}^n \psi \left( \frac{tj}{n} \xi \right) \right).\tag{5}$$

Finally, we note that

$$X_t \stackrel{\text{def}}{=} \int_0^t L_s \, ds = \lim_{n \to \infty} \frac{t}{n} \sum_{j=1}^n L_{tj/n},$$

and thus, by letting $n \to \infty$ in (5), we get

$$\mathbb{E}\exp(i \xi X_t) = \exp \left(- \int_0^t \psi(s \xi) \, ds \right).$$
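In the Brownian case $\psi(\xi) = \xi^2/2$ this last limit is easy to check numerically: the Riemann sum $\frac{t}{n} \sum_{j=1}^n \psi(\tfrac{tj}{n}\xi)$ appearing in the exponent converges to $\int_0^t \psi(s\xi)\,ds = \tfrac{t^3\xi^2}{6}$. A minimal sketch (plain numpy; the values of $t$, $\xi$ and $n$ are arbitrary):

```python
import numpy as np

def psi(x):
    return x**2 / 2          # characteristic exponent of Brownian motion

t, xi, n = 2.0, 0.7, 10_000
j = np.arange(1, n + 1)

# Riemann sum for int_0^t psi(s * xi) ds on the grid s_j = t j / n
riemann = (t / n) * psi(t * j / n * xi).sum()
exact = t**3 * xi**2 / 6     # int_0^t (s * xi)^2 / 2 ds
print(riemann, exact)
```

The two numbers agree to within $O(1/n)$, consistent with the limit above.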