Show that a process is gaussian


I need help with the following exercise; I would like to know whether what I have done is correct.

Let $(X_t)_{t\geq 0}$ be the process defined as $$X_t=e^{\lambda t} X_0-\sigma \Big(\lambda \int_0^t e^{-\lambda(t-s)} \,W_s ds +W_t\Big),$$ where $(W_t)$ is the Wiener process, $\sigma, \lambda$ are positive constants and $X_0\sim N(\mu,\gamma^2)$ is independent of $(W_t)$.

We want to prove that it is a gaussian process, i.e. that all its finite-dimensional marginal distributions are gaussian measures.

Now consider a one-dimensional marginal, $\mu_t$, for a fixed $t\geq 0$, which is defined as $\mu_t(I)=\mathbb P(X_t\in I) \quad \forall I\in \mathcal B(\mathbb R)$: we want to show that it is gaussian.

So let us write the integral in the definition of $X_t$ in the following way: $$\int_0^t e^{-\lambda(t-s)} \,W_s ds=\int_0^t W_s\, df(s),$$ where $f(s)=\frac{1}{\lambda}e^{-\lambda(t-s)}$ and the last integral is the Stieltjes integral. We have that, integrating by parts, $$\int_0^t W_s\, df(s)=W_t-\int_0^t f(s) dW_s=W_t-\frac{1}{\lambda}\int_0^t e^{-\lambda(t-s)} dW_s.$$

Denote by $\Pi=\{0=t_0<t_1<\dots<t_n=t\}$ any partition of the interval $[0,t]$ and let $|\Pi|=\max |t_{i+1}-t_i|$. Then $$\int_0^t e^{-\lambda(t-s)} dW_s:=\lim_{|\Pi|\to 0} \sum_{k=0 }^{n-1} f(t_k)(W_{t_{k+1}}-W_{t_{k}}).$$

Thus $$X_t=e^{\lambda t} X_0-\sigma \Big((\lambda+1)W_t- \int_0^t e^{-\lambda(t-s)} dW_s\Big)= \\ \qquad \,=e^{\lambda t} X_0-\sigma \lim_{|\Pi|\to 0} \sum_{k=0 }^{n-1} (\lambda+1+f(t_k))(W_{t_{k+1}}-W_{t_{k}}).$$

Now, using the fact that increments of Wiener process are independent, one can show that $$ \lim_{|\Pi|\to 0} \sum_{k=0 }^{n-1} (\lambda+1+f(t_k))(W_{t_{k+1}}-W_{t_{k}})\sim N\Big(0,\int_0^t (\lambda+1+f(s))^2 ds \Big).$$

So $X_t$ is gaussian, and hence $\mu_t$ is a gaussian measure. Is this correct so far?

Now, my problem is: how can I prove that the $n$-dimensional marginal distributions are also gaussian? If I knew that the variables $X_t$ were independent, we would be done. But are they independent?

Best answer:

As you observed**, integration by parts gives

$$ X_t = e^{\lambda t} X_0-\sigma \Big(W_t-\int\limits_{0}^{t} e^{-\lambda(t-s)}\,\mathrm dW_s +W_t\Big) = e^{\lambda t} X_0-\sigma \Big(2W_t-\int\limits_{0}^{t} e^{-\lambda(t-s)}\,\mathrm dW_s\Big) \\ = e^{\lambda t} X_0-\sigma \Big(2\int\limits_{0}^{t}\mathrm dW_s-\int\limits_{0}^{t} e^{-\lambda(t-s)}\,\mathrm dW_s\Big) = e^{\lambda t} X_0-\sigma \int\limits_{0}^{t}\left(2 - e^{-\lambda(t-s)}\right)\mathrm dW_s. $$
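This identity can be sanity-checked numerically. The sketch below (not part of the proof; all parameter values are arbitrary choices) simulates one Brownian path on a fine grid and compares left-point Riemann-sum versions of $\lambda \int_0^t e^{-\lambda(t-s)}W_s\,\mathrm ds + W_t$ and $\int_0^t \left(2-e^{-\lambda(t-s)}\right)\mathrm dW_s$:

```python
import numpy as np

# Pathwise check of the integration-by-parts identity on a discrete grid
# (illustration only; lam, t, n are arbitrary).
rng = np.random.default_rng(0)
lam, t, n = 0.7, 2.0, 100_000
ds = t / n
s = np.linspace(0.0, t, n + 1)               # grid 0 = s_0 < ... < s_n = t
dW = rng.normal(0.0, np.sqrt(ds), n)         # independent Brownian increments
W = np.concatenate(([0.0], np.cumsum(dW)))   # W evaluated at the grid points

kernel = np.exp(-lam * (t - s[:-1]))         # e^{-lambda (t - s_k)}

# lambda * int_0^t e^{-lambda(t-s)} W_s ds + W_t  (original noise term)
lhs = lam * np.sum(kernel * W[:-1]) * ds + W[-1]
# int_0^t (2 - e^{-lambda(t-s)}) dW_s  (after integration by parts)
rhs = np.sum((2.0 - kernel) * dW)

print(abs(lhs - rhs))   # small; vanishes as the mesh is refined
```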

Now, $X_0$ is independent of $(W_t)_{t\geqslant 0}$. Therefore, $e^{\lambda t} X_0$ is independent of $\sigma \Big(\int\limits_{0}^{t}\left(2 - e^{-\lambda(t-s)}\right)\mathrm dW_s\Big)$. Furthermore, their respective distributions are univariate gaussian:

$$ e^{\lambda t} X_0\sim\mathcal{N}\left(\mu e^{\lambda t}, \gamma^2e^{2\lambda t}\right),\\\sigma \Big(\int\limits_{0}^{t}\left(2 - e^{-\lambda(t-s)}\right)\mathrm dW_s\Big)\sim\mathcal{N}\left(0 , \sigma^2\int\limits_{0}^{t}\left(2 - e^{-\lambda(t-s)}\right)^2\mathrm ds\right). $$
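The zero mean and the Itô-isometry variance of the stochastic integral can also be checked by simulation. The sketch below (arbitrary parameter values, left-point discretization of the integral) compares the sample mean and variance over many simulated paths with $\sigma^2\int_0^t\left(2-e^{-\lambda(t-s)}\right)^2\mathrm ds$, evaluated in closed form:

```python
import numpy as np

# Monte Carlo check of the stated mean and variance
# (illustration only; lam, sigma, t, n, m are arbitrary).
rng = np.random.default_rng(1)
lam, sigma, t = 0.7, 1.3, 1.5
n, m = 400, 20_000                     # time steps per path, number of paths
ds = t / n
s = np.linspace(0.0, t, n + 1)[:-1]    # left endpoints s_0, ..., s_{n-1}
g = 2.0 - np.exp(-lam * (t - s))       # integrand 2 - e^{-lambda(t-s)}

dW = rng.normal(0.0, np.sqrt(ds), size=(m, n))
I = sigma * (dW @ g)                   # sigma * int_0^t g(s) dW_s, one value per path

# Ito isometry; the integral in closed form (substitute u = t - s):
# int_0^t (2 - e^{-lam*u})^2 du
#   = 4t - (4/lam)(1 - e^{-lam*t}) + (1/(2*lam))(1 - e^{-2*lam*t})
theo_var = sigma**2 * (4 * t - (4 / lam) * (1 - np.exp(-lam * t))
                       + (1 / (2 * lam)) * (1 - np.exp(-2 * lam * t)))

print(I.mean(), I.var(), theo_var)     # sample mean near 0, variances close
```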

Since these are independent univariate gaussians, any linear combination of them is again univariate gaussian. In particular, $X_t$, being their difference, is univariate gaussian.


This also has implications for the finite-dimensional distributions of the process $(X_t)_{t\geqslant 0}$. Consider any finite collection of distinct times $0< t_1 < \ldots < t_n$ and arbitrary real numbers $\alpha_1, \ldots, \alpha_n$. We have:

$$ \alpha_1 X_{t_1} + \ldots + \alpha_n X_{t_n} = X_{0}\sum\limits_{i = 1}^{n}\alpha_ie^{\lambda t_i} - \\\sigma\Bigg(\int\limits_{0}^{t_1}\sum\limits_{i=1}^{n}\alpha_if(t_i , s)\mathrm dW_s + \int\limits_{t_1}^{t_2}\sum\limits_{i=2}^{n}\alpha_if(t_i , s)\mathrm dW_s + \ldots + \int\limits_{t_{n-1}}^{t_n}\alpha_nf(t_n , s)\mathrm dW_s\Bigg) $$ where, for notational convenience, we used $f(t_i,s) := 2-e^{-\lambda(t_i-s) }$.

Look carefully now: each term $\int\limits_{t_{j-1}}^{t_{j}}\sum\limits_{i=j}^{n}\alpha_if(t_i , s)\mathrm dW_s$ above is a univariate gaussian independent of the other similar terms and of $X_0$. Consequently, $\alpha_1 X_{t_1} + \ldots + \alpha_n X_{t_n}$ is a univariate gaussian, as it is a linear combination of independent univariate gaussians. So, each finite-dimensional distribution $\left(X_{t_1}, \ldots, X_{t_n}\right)$ is multivariate gaussian, by definition. We are done.
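As a sanity check on this conclusion, the sketch below (arbitrary parameters, times, and coefficients) simulates the representation $X_t = e^{\lambda t}X_0 - \sigma\int_0^t\left(2-e^{-\lambda(t-s)}\right)\mathrm dW_s$ on a grid for $n=2$ times and compares the sample mean and variance of $\alpha_1 X_{t_1} + \alpha_2 X_{t_2}$ against the values predicted by the disjoint-increment decomposition. The predicted variance is computed with the same discretization, so the comparison is only up to Monte Carlo error:

```python
import numpy as np

# Monte Carlo check of the finite-dimensional argument for two times
# (illustration only; all numerical values are arbitrary).
rng = np.random.default_rng(2)
lam, sigma, mu, gamma = 0.7, 1.3, 0.5, 0.8
t1, t2 = 1.0, 2.0
a1, a2 = 1.0, 0.5
n, m = 400, 20_000                      # grid steps on [0, t2], number of paths
ds = t2 / n
n1 = n // 2                             # grid index of t1 (here t1 = t2 / 2)
s = np.linspace(0.0, t2, n + 1)[:-1]    # left endpoints

f1 = 2.0 - np.exp(-lam * (t1 - s[:n1]))   # f(t1, s) on [0, t1)
f2 = 2.0 - np.exp(-lam * (t2 - s))        # f(t2, s) on [0, t2)

dW = rng.normal(0.0, np.sqrt(ds), size=(m, n))
X0 = rng.normal(mu, gamma, size=m)        # independent of the increments

X1 = np.exp(lam * t1) * X0 - sigma * (dW[:, :n1] @ f1)
X2 = np.exp(lam * t2) * X0 - sigma * (dW @ f2)
Y = a1 * X1 + a2 * X2

c = a1 * np.exp(lam * t1) + a2 * np.exp(lam * t2)
theo_mean = mu * c
# variance of the X0 part plus the two disjoint stochastic-integral pieces,
# discretized exactly as in the simulation above
theo_var = (gamma * c) ** 2 \
    + sigma**2 * np.sum((a1 * f1 + a2 * f2[:n1]) ** 2) * ds \
    + sigma**2 * np.sum((a2 * f2[n1:]) ** 2) * ds

print(Y.mean(), theo_mean)   # close
print(Y.var(), theo_var)     # close
```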

** p.s. I think you have a "$\lambda$" in your final gaussian variance term that should not be there, since earlier on in your argument you intended to use $$\mathrm d\left(e^{-\lambda(t-s)}\right) = \lambda e^{-\lambda(t-s)}\mathrm ds.$$ Also, there is a "$+$" in your final gaussian variance term that should be a "$-$".