Proof of $n$-dimensional Brownian motion identities for components of $B_t$


The person in this post (Proving Kolmogorov's continuity condition holds for Brownian motion?) used the following two identities in their proof for an $n$-dimensional Brownian motion: $$ \mathbb{E}((B_{t,i}-B_{s,i})^4) = 3(t-s)^2$$ and $$ \mathbb{E}((B_{t,i}-B_{s,i})^2(B_{t,j}-B_{s,j})^2) = (t-s)^2$$

For the first identity I will use that $B_t$ is a Gaussian process, so that for any $u_1, u_2 \in \mathbb{R}$ we must have $$ \mathbb{E}(\exp(i u_1 B_{t,j} + iu_2 B_{s,j})) = \exp\Big(-\frac{1}{2}\sum_{k,m=1}^2 u_k c_{km} u_m + i(u_1 M + u_2 M)\Big)$$ where $\mathbb{E}(B_{t,j})=\mathbb{E}(B_{s,j}) = M$ and $C = [c_{km}]$ is the covariance matrix of $(B_{t,j},B_{s,j})$. Choosing $u_1 = u$ and $u_2 = -u$ for any $u\in\mathbb{R}$, and assuming $s\leq t$ so that $\operatorname{Cov}(B_{t,j},B_{s,j}) = s$, we are left with $$ \mathbb{E}(\exp(i u( B_{t,j} - B_{s,j}))) = \exp\Big(-\frac{1}{2}u^2(t-2s+s)\Big) = \exp\Big(-\frac{1}{2}u^2(t-s)\Big)$$ Taking the Taylor expansion of both sides, we are left with $$ \sum_{n=0}^\infty \frac{(iu)^n}{n!}\mathbb{E}((B_{t,j}-B_{s,j})^n) = \sum_{k=0}^\infty \frac{(-1)^ku^{2k}(t-s)^{k}}{k!2^k}$$ Now the right-hand side is strictly real, so the left-hand side must also be strictly real. Its imaginary part, which consists of the odd-order terms, must therefore vanish, leaving only the even-order terms: $$ \sum_{k=0}^\infty \frac{(-1)^k u^{2k}}{(2k)!}\mathbb{E}((B_{t,j}-B_{s,j})^{2k}) = \sum_{k=0}^\infty \frac{(-1)^ku^{2k}(t-s)^{k}}{k!2^k}$$ This must hold for all $u\in\mathbb{R}$, so by matching the coefficients of the two power series we conclude that $$ \mathbb{E}((B_{t,j}-B_{s,j})^{2k}) = \frac{(2k)!}{k!2^k}(t-s)^{k}$$ In particular, for $k=2$ we are left with $$ \mathbb{E}((B_{t,j}-B_{s,j})^{4}) = 3(t-s)^2$$
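Not part of the proof, but the even-moment formula above is easy to sanity-check numerically. The sketch below simulates Gaussian increments with numpy (the choice of times $t, s$ and the sample size are arbitrary) and compares sample moments against $\frac{(2k)!}{k!2^k}(t-s)^k$:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
t, s = 2.0, 0.5               # arbitrary times with s <= t
var = t - s                   # B_{t,j} - B_{s,j} ~ N(0, t - s)
x = rng.normal(0.0, math.sqrt(var), size=1_000_000)

# Closed form: E[(B_t - B_s)^{2k}] = (2k)! / (k! 2^k) * (t - s)^k
results = {}
for k in (1, 2, 3):
    exact = math.factorial(2 * k) / (math.factorial(k) * 2**k) * var**k
    estimate = np.mean(x ** (2 * k))
    results[k] = (exact, estimate)
    print(k, exact, estimate)
```

For $k=2$ the exact value is $3(t-s)^2$, and the Monte Carlo estimate should agree to a few decimal places.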

For the second statement, I understand that the components of $B_t$ are independent and thus uncorrelated. I want to be able to do something as simple as $$ \mathbb{E}((B_{t,i}-B_{s,i})^2(B_{t,j}-B_{s,j})^2) = \mathbb{E}((B_{t,i}-B_{s,i})^2)\ \mathbb{E}((B_{t,j}-B_{s,j})^2) = (t-s)^2$$ In order to do that I need to show that $B_{t,i}-B_{s,i}$ and $B_{t,j}-B_{s,j}$ are independent, and thus that $(B_{t,i}-B_{s,i})^2$ and $(B_{t,j}-B_{s,j})^2$ are independent as well. All I have to go on is that $B_{t,i}$ and $B_{t,j}$ are independent variables, and similarly $B_{s,i}$ and $B_{s,j}$ are as well. It seems that there's a simple transformation rule that I am missing.
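The claimed factorization is also easy to check numerically. This sketch (my own choice of times and sample size, for illustration only) draws the two component increments as independent $N(0, t-s)$ variables and compares $\mathbb{E}[(\Delta B_i)^2 (\Delta B_j)^2]$ against $(t-s)^2$:

```python
import numpy as np

rng = np.random.default_rng(1)
t, s = 1.0, 0.25
n = 1_000_000
# Independent component increments of an n-dimensional Brownian motion
xi = rng.normal(0.0, np.sqrt(t - s), size=n)   # B_{t,i} - B_{s,i}
xj = rng.normal(0.0, np.sqrt(t - s), size=n)   # B_{t,j} - B_{s,j}

lhs = np.mean(xi**2 * xj**2)   # sample estimate of E[(ΔB_i)^2 (ΔB_j)^2]
rhs = (t - s) ** 2             # claimed exact value
print(lhs, rhs)
```

Of course this only illustrates the identity; the mathematical content is exactly the independence question raised above.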

Edit: Including this link because it shows the claim that $\sigma(B_{t,i}:t\geq 0)$ and $\sigma(B_{t,j}:t\geq 0)$ are independent if $i\not=j$. Showing that two Gaussian processes are independent

BEST ANSWER

All I have to go on is that $B_{t,i}$ and $B_{t,j}$ are independent variables and similarly $B_{s,i}$ and $B_{s,j}$ are as well.

That is not all you have to go on! The definition of $n$-dimensional Brownian motion includes the property that the processes $(B_{t,i} : t \ge 0)$ and $(B_{t,j} : t \ge 0)$ are independent (indeed, mutually independent as $i$ varies). This means that the $\sigma$-fields $\sigma(B_{t,i} : t \ge 0)$ and $\sigma(B_{t,j} : t \ge 0)$ are independent; in particular, given any multivariate Borel functions $f,g$ and any finite collections of times $t_1, \dots, t_p$, $s_1, \dots, s_q$, the random variables $$f(B_{t_1, i}, \dots, B_{t_p, i}), \quad g(B_{s_1,j}, \dots, B_{s_q,j})$$ are independent. (It also holds for countably many indices once you define what that means.)
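To illustrate (not prove) this process-level independence, one can simulate two components as independent random walks and check that nontrivial functionals of many path values, one from each component, are uncorrelated. The functionals $f$ (running maximum) and $g$ (discretized integral of the square) below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, dt = 20_000, 100, 0.01
# Two independent components of a Brownian motion, built from Gaussian increments
bi = np.cumsum(rng.normal(0, np.sqrt(dt), (n_paths, n_steps)), axis=1)
bj = np.cumsum(rng.normal(0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

# Borel functions of finitely many path values (illustrative choices)
f = bi.max(axis=1)             # running maximum of component i
g = (bj**2).sum(axis=1) * dt   # ≈ integral of the square of component j

corr = np.corrcoef(f, g)[0, 1]
print(corr)   # near 0, consistent with independence of f and g
```

Zero sample correlation is of course weaker than independence, but it is the kind of behavior the $\sigma$-field statement guarantees for every pair $f, g$.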

Observe that this is much stronger than merely saying that $B_{t,i}, B_{t,j}$ are independent for each $t$.

So in fact, $B_{t,i}-B_{s,i}$ is independent of $B_{t,j}-B_{s,j}$, and so your proposed argument is perfectly well justified.


The first problem really has very little to do with Brownian motion. You know that $B_{t,i}-B_{s,i}$ has a normal distribution with mean $0$ and variance $t-s$, so this is really just asking you to compute the fourth moment of a normal random variable, i.e. $E[(\sqrt{t-s} Z)^4] = (t-s)^2 E[Z^4]$ where $Z \sim N(0,1)$. There are several ways to verify that $E[Z^4]=3$: write down the integral against the Gaussian density and integrate by parts (each integration by parts lowers the power by two, via $E[Z^{2k}] = (2k-1)E[Z^{2k-2}]$); use the moment generating function or characteristic function; or look it up on Wikipedia.
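The direct-integration route can be carried out symbolically; a minimal sketch with sympy (my choice of tool, not mentioned in the answer) computes $E[Z^4]$ against the standard normal density:

```python
import sympy as sp

z = sp.symbols('z', real=True)
phi = sp.exp(-z**2 / 2) / sp.sqrt(2 * sp.pi)   # standard normal density

# Fourth moment of Z ~ N(0, 1) by direct integration
m4 = sp.integrate(z**4 * phi, (z, -sp.oo, sp.oo))
print(m4)   # 3
```

Multiplying by $(t-s)^2$ then recovers the identity $\mathbb{E}((B_{t,i}-B_{s,i})^4) = 3(t-s)^2$.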