Showing independence of increments of a stochastic process


The textbook on stochastic calculus I am now reading says that if $X\colon [0,\infty)\times\Omega\rightarrow\mathbb R$ is a stochastic process such that

  1. $X(t)-X(s)\sim N(0,t-s)$ for all $t \geq s \geq 0$,

  2. $E[X(t)X(s)]=\min\{s,t\}$ for all $s,t \geq 0$,

then $X$ has independent increments, i.e. for every $0 \leq t_1<\dots<t_n$, the random variables $X(t_1)$, $X(t_2)-X(t_1)$, …, $X(t_n)-X(t_{n-1})$ are independent.

Here $X(t)$ denotes a random variable $X(t):\Omega\rightarrow \mathbb R$ such that $X(t)(\omega)=X(t,\omega)$.

But I guess this is not true. I suspect that we need an additional condition that $X$ is a Gaussian process. (Then, it is easy to show the independence)

Am I on the right track? If so, can you give me some counterexamples?

Or can it be shown without assuming Gaussian process?

Any hint would be appreciated! Thanks and regards.

2 Answers

Answer 1:

Let $t_0=0$ and look at the sequence $Y_i=X(t_i)-X(t_{i-1})$ for $i\in[1:n]$. These are Gaussian with mean $0$ and respective variances $t_i-t_{i-1}$. Let's look at all the covariances. For $i<j$ (without loss of generality) we have $t_{i-1}<t_i\leq t_{j-1}<t_j$, so each $\min$ from condition 2 is the time with the smaller index: \begin{align*} \mathbb{E}[Y_i Y_j] &= \mathbb{E}[(X(t_i)-X(t_{i-1}))(X(t_j)-X(t_{j-1}))]\\ &=\mathbb{E}[X(t_i)X(t_j)]-\mathbb{E}[X(t_{i-1})X(t_j)]-\mathbb{E}[X(t_i)X(t_{j-1})]+\mathbb{E}[X(t_{i-1})X(t_{j-1})]\\ &=t_i-t_{i-1}-t_i+t_{i-1}\\ &=0. \end{align*} So the covariance matrix of $(Y_1,\dots,Y_n)$ is diagonal, and therefore (granted joint Gaussianity, see the caveat below) the $Y_i$ are all independent.

From there you should be able to conclude for the increments in your statement: the only difference is that the first element is $X(t_1)=Y_1+X(0)$, and $X(0)$ is uncorrelated with every $Y_i$ for $i\in[2:n]$. In fact condition 2 gives $\mathbb{E}[X(0)^2]=\min\{0,0\}=0$, so $X(0)=0$ almost surely and $X(t_1)=Y_1$.
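As a sanity check of the computation above, here is a small NumPy simulation (the partition times and sample sizes are arbitrary choices, not from the question). It simulates standard Brownian motion, which satisfies both conditions, and confirms empirically that $\mathbb{E}[X(t)X(s)]=\min\{s,t\}$ and that the increment covariance matrix is approximately $\operatorname{diag}(t_i-t_{i-1})$:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.array([0.5, 1.0, 2.5, 4.0])  # a hypothetical partition 0 < t_1 < ... < t_n
n_paths = 200_000

# Simulate standard Brownian motion at the times t by summing
# independent Gaussian increments with variance equal to the time step.
dt = np.diff(np.concatenate(([0.0], t)))
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, len(t))), axis=1)

# Condition 2: the covariance matrix of (X(t_1), ..., X(t_n))
# should be approximately the matrix min(t_i, t_j).
print(np.round(np.cov(X, rowvar=False), 2))

# The increments Y_i = X(t_i) - X(t_{i-1}) should then be uncorrelated,
# with Var(Y_i) = t_i - t_{i-1}, exactly as in the computation above.
Y = np.diff(X, axis=1, prepend=0.0)
print(np.round(np.cov(Y, rowvar=False), 2))
```

With this many paths the off-diagonal entries of the increment covariance come out within sampling noise of zero, matching the hand computation.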

The only problem I can see is the case $t_1=0$, but then you can remove the first element and reduce $n$ by one to prove your result.


All of this is true only if $Y=(Y_1,\dots,Y_n)$ is jointly Gaussian. A sufficient (in fact equivalent) condition for $Y$ to be jointly Gaussian is that $\mathbf a^T Y$ is a Gaussian random variable for every vector $\mathbf a$ of size $n$.

I am not sure how to prove that, but something that may be of use is that $Y_i+Y_{i+1}+\dots+Y_{j-1}+Y_j = X(t_j)-X(t_{i-1})$ is Gaussian for any $i<j$, by condition 1.
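To see why the joint-Gaussianity caveat is not vacuous, here is the classic illustration (note: this pair does not come from a process satisfying conditions 1 and 2, so it is not a counterexample to the question itself; it only shows that Gaussian marginals plus zero correlation do not imply independence). Take $Z\sim N(0,1)$ and $W=sZ$ with an independent random sign $s$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
Z = rng.normal(size=n)
s = rng.choice([-1.0, 1.0], size=n)  # random signs, independent of Z
W = s * Z                            # W is also N(0, 1) by symmetry

# Uncorrelated: Cov(Z, W) = E[s] E[Z^2] = 0.
print(np.cov(Z, W)[0, 1])

# But far from independent: |W| = |Z|. For instance
# E[Z^2 W^2] = E[Z^4] = 3, whereas independence would force
# E[Z^2] E[W^2] = 1.
print(np.mean(Z**2 * W**2), np.mean(Z**2) * np.mean(W**2))
```

The pair $(Z,W)$ has Gaussian marginals and zero covariance but is not jointly Gaussian, and the zero covariance does not give independence.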

Answer 2:

I believe you can use a trick similar to the one used in the Lévy characterization of Brownian motion: apply Itô's formula to the characteristic function of $X_t$, then use iterated expectations to show that the joint characteristic function of the increments $X_{t_{i+1}}-X_{t_i}$ factorizes, which gives their independence.
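A sketch of that factorization step (this is the standard tower-property argument from the Lévy characterization, assuming one can first establish that $E\big[e^{iu(X_t-X_s)}\mid\mathcal F_s\big]=e^{-u^2(t-s)/2}$ for all $s<t$, e.g. via Itô):

```latex
\begin{aligned}
E\Big[\exp\Big(i\textstyle\sum_{k=1}^{n} u_k (X_{t_k}-X_{t_{k-1}})\Big)\Big]
  &= E\Big[\exp\Big(i\textstyle\sum_{k=1}^{n-1} u_k (X_{t_k}-X_{t_{k-1}})\Big)\,
     E\big[e^{i u_n (X_{t_n}-X_{t_{n-1}})}\,\big|\,\mathcal F_{t_{n-1}}\big]\Big] \\
  &= e^{-u_n^2 (t_n-t_{n-1})/2}\,
     E\Big[\exp\Big(i\textstyle\sum_{k=1}^{n-1} u_k (X_{t_k}-X_{t_{k-1}})\Big)\Big] \\
  &= \dots = \prod_{k=1}^{n} e^{-u_k^2 (t_k-t_{k-1})/2}.
\end{aligned}
```

The joint characteristic function is the product of the marginal $N(0,t_k-t_{k-1})$ characteristic functions, which is exactly independence of the increments.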

See this article for details

http://individual.utoronto.ca/normand/Documents/MATH5501/Project-3/Levy_characterization_of_Brownian_motion.pdf