Independence of the components of a multidimensional Brownian motion


Let $B = (B^1, \dots, B^n)$ be an $n$-dimensional ($n \in \{1, 2, \dots\}$) Brownian motion, i.e. $B = (B_t)_{t \geq 0} : \Omega \rightarrow (\mathbb{R}^n)^{[0,\infty)}$ has continuous paths, $B_0 = 0$ almost surely, $B$ has independent increments, and, for all $s, t \geq 0$ with $s < t$, $B_t - B_s \sim N(0, (t - s) E_n)$, where $E_n$ is the identity matrix in $\mathbb{R}^{n \times n}$. Suppose $B$ is adapted to a filtration $\mathfrak{F}$ (not necessarily the natural filtration). Is it the case that the components $B^1, \dots, B^n$ are independent?
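As a quick sanity check (not a proof), one can simulate a two-dimensional Brownian motion and verify that the empirical correlation between the two components at a fixed time is close to zero. The sample sizes, time grid, and seed below are arbitrary choices for illustration:

```python
import numpy as np

# Simulate many 2-dimensional Brownian paths on [0, T] and check that the
# two components are (empirically) uncorrelated at the terminal time.
rng = np.random.default_rng(0)

n_paths, n_steps, T = 20_000, 100, 1.0
dt = T / n_steps

# Each increment B_{t+dt} - B_t ~ N(0, dt * I_2), independent across steps.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps, 2))
B_T = increments.sum(axis=1)  # value of (B^1_T, B^2_T) for each path

corr = np.corrcoef(B_T[:, 0], B_T[:, 1])[0, 1]
print(f"empirical correlation at T={T}: {corr:.4f}")
```

With $20{,}000$ paths the sampling error of the correlation is of order $1/\sqrt{20{,}000} \approx 0.007$, so the printed value should be very close to $0$.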


EDIT

I have written all my thoughts on the topic in my answer below.

BEST ANSWER

The following answer is based on the proof of theorem 7.8 ("Multidimensional Brownian Motion") in Jochen Wengenroth's textbook "Wahrscheinlichkeitstheorie" (de Gruyter, 2008), p. 136.

We will show, w.l.o.g., that $B^1 = (B^1_t)_{t \geq 0}$ and $C := (B^2_t, \dots, B^n_t)_{t \geq 0}$ are independent ($B^1$ is a one-dimensional stochastic process and $C$ is an $(n-1)$-dimensional stochastic process). By the standard $\pi$-system lemma, if $\mathfrak{b}$ is a $\pi$-system that generates $\sigma(B^1_t : t \geq 0)$, and if $\mathfrak{c}$ is a $\pi$-system that generates $\sigma(C_t : t \geq 0)$, it suffices to show that $\mathfrak{b}$ and $\mathfrak{c}$ are independent. Denote by $\mathcal{P}_0$ the collection of all non-empty, finite subsets of $[0, \infty)$, and for $J = \{j_0, \dots, j_m\} \in \mathcal{P}_0$ write $\sigma(B^1_J) := \sigma(B^1_{j_0}, \dots, B^1_{j_m})$ and $\sigma(C_J) := \sigma(C_{j_0}, \dots, C_{j_m})$. Then $\bigcup \{\sigma(B^1_J) : J \in \mathcal{P}_0\}$ is a $\pi$-system that generates $\sigma(B^1)$ (it is stable under intersections because $\sigma(B^1_J) \cup \sigma(B^1_{J'}) \subseteq \sigma(B^1_{J \cup J'})$), and likewise $\bigcup \{\sigma(C_J) : J \in \mathcal{P}_0\}$ is a $\pi$-system that generates $\sigma(C)$.
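The reduction to $\pi$-systems rests on the following standard lemma (a consequence of Dynkin's $\pi$-$\lambda$ theorem); it is stated here for reference, with the environment name `lemma` assumed to be defined in the preamble:

```latex
% Standard independence lemma via Dynkin's pi-lambda theorem:
% if two pi-systems are independent, so are the sigma-algebras they generate.
\begin{lemma}
Let $(\Omega, \mathcal{A}, P)$ be a probability space and let
$\mathfrak{b}, \mathfrak{c} \subseteq \mathcal{A}$ be $\pi$-systems with
\[
  P(B \cap C) = P(B)\,P(C)
  \qquad \text{for all } B \in \mathfrak{b},\ C \in \mathfrak{c}.
\]
Then $\sigma(\mathfrak{b})$ and $\sigma(\mathfrak{c})$ are independent.
\end{lemma}
```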

Now let $J, J' \in \mathcal{P}_0$. We will show that $\sigma(B^1_J)$ and $\sigma(C_{J'})$ are independent, or, equivalently, that the random vectors $B^1_J$ and $[C_{J'}]$ are independent, where the notation $[v_0, \dots, v_k]$ denotes the vector obtained by concatenating the components of the individual vectors $v_0, \dots, v_k$. Define $I := J \cup J'$. Since $B^1_J$ is a coordinate projection of $B^1_I$ and $[C_{J'}]$ is a coordinate projection of $[C_I]$, it suffices to show that $B^1_I$ and $[C_I]$ are independent.

Write $I = \{i_0 < \dots < i_m\}$, set $i_{-1} := 0$ (recall $B_0 = 0$ almost surely), and denote by $\beta$ the $(m+1)n$-dimensional vector $[B_{i_0} - B_{i_{-1}}, B_{i_1} - B_{i_0}, \dots, B_{i_m} - B_{i_{m - 1}}]$. $\beta$ is multivariate normal: since $B$ has independent increments and each increment is normal, any linear combination of $\beta$'s components is a one-dimensional normal random variable. Since $\beta$ is Gaussian and its components are pairwise uncorrelated, its components are independent. Hence $\alpha := [B^1_{i_0}, B^1_{i_1} - B^1_{i_0}, \dots, B^1_{i_m} - B^1_{i_{m - 1}}]$ and $\gamma := [C_{i_0}, C_{i_1} - C_{i_0}, \dots, C_{i_m} - C_{i_{m - 1}}]$, being functions of disjoint sets of $\beta$'s components, are independent. Since $B^1_I$ is a linear transformation of $\alpha$ (partial sums of the increments) and $[C_I]$ is a linear transformation of $\gamma$, $B^1_I$ and $[C_I]$ are independent, Q.E.D.
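For completeness, the claim that the entries of such an increment vector are pairwise uncorrelated can be checked term by term. With the convention $i_{-1} := 0$, for component indices $k, l \in \{1, \dots, n\}$ and increment indices $a, b \in \{0, \dots, m\}$:

```latex
% Cross terms (a != b) vanish because increments over disjoint intervals are
% independent; for a = b the covariance matrix (i_a - i_{a-1}) E_n is
% diagonal, so distinct components k != l are uncorrelated as well.
\[
  \operatorname{Cov}\!\bigl(B^k_{i_a} - B^k_{i_{a-1}},\; B^l_{i_b} - B^l_{i_{b-1}}\bigr)
  = \delta_{ab}\,\delta_{kl}\,(i_a - i_{a-1}),
\]
```

so any two distinct entries of the increment vector are indeed uncorrelated.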