We are currently finishing the last year of a PhD in statistics, and we wonder if you could help us with the following.
Let $T = [0,1]$ and let $X = \left( X_{t}, t \in T \right)$ be a Gaussian process with mean function $m$ and covariance function $W$; we write $X_{t}$ and $X(t)$ interchangeably, and $X \sim GP(m,W)$.
First, suppose that the sample paths $t \mapsto X_{t}$ are continuous a.s., and let $Y$ be the stochastic process defined on $T$ by $$Y_{t} = \int_{0}^{t} X(u)\, du, \quad \text{a.s.}$$
The question is: what can you say about the law of $Y$? In particular, do you believe it to be $GP(m^{\star},W^{\star})$, with $$\begin{array}{ccc} m^{\star}(t) & = & \int_{0}^{t} m(u)du,\\ W^{\star}(s,t) & = & \int_{0}^{s} \int_{0}^{t} W(u,v)dudv? \end{array}$$
If the preceding statement is true, could you please give us some references?
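For instance, in the case of standard Brownian motion ($m\equiv 0$, $W(s,t)=\min(s,t)$), the conjectured formula gives, for $s\le t$, $$ W^{\star}(s,t)=\int_{0}^{s}\int_{0}^{t}\min(u,v)\,dv\,du = \int_{0}^{s}\left(\frac{u^{2}}{2}+u(t-u)\right)du = \frac{s^{2}t}{2}-\frac{s^{3}}{6}, $$ which matches the well-known covariance of integrated Brownian motion, so the conjecture at least passes this sanity check.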
Second, suppose that the sample paths $t \mapsto X_{t}$ are differentiable a.s., and let $Z$ be the stochastic process defined on $T$ by $$Z_{t} = \frac{dX_{t}}{dt}, \quad \text{a.s.}$$
The question now is: what can you say about the law of $Z$? Do you have any reference where we could find a result?
We would be very grateful for any feedback you can give us. Thank you very much.
For $n\in\mathbb N$, define $$ Y_t^n=\frac tn\sum_{k=0}^{n-1}X\left(\frac knt\right), $$ so that, by continuity of the sample paths (left Riemann sums converge to the integral), $$ Y_t^n\xrightarrow[n\rightarrow+\infty]{}Y_t $$ almost surely. Now, $(Y_t^n)_{0\le t\le1}$ is a Gaussian process, since any finite linear combination of its values is a finite linear combination of values of $(X_t)_{0\le t\le1}$. Thus $(Y_t)_{0\le t\le1}$ is a Gaussian process, because an almost sure limit of Gaussian random vectors is Gaussian.

Additionally, you can compute the mean function of $Y^n$: $$ m_n(t)=\frac{t}{n} \sum_{k=0}^{n-1} m\left(\frac{kt}{n}\right), $$ and its covariance function: $$ W_n(s,t)=\frac{st}{n^2} \sum_{j=0}^{n-1} \sum_{k=0}^{n-1} W\left(\frac{js}n,\frac{kt}n\right). $$ You can also check that, since $(X_t)$ has continuous sample paths, $m$ and $W$ are necessarily continuous, and therefore $$ m_n(t)\xrightarrow[n\rightarrow+\infty]{}m^*(t),\text{ and }W_n(s,t)\xrightarrow[n\rightarrow+\infty]{}W^*(s,t). $$

Lastly, since $(Y_t^n,Y_s^n)\to (Y_t,Y_s)$ almost surely and each $(Y_t^n,Y_s^n)$ is a Gaussian vector, the limit $(Y_t,Y_s)$ is a Gaussian vector whose parameters are the limits of the parameters of $(Y_t^n,Y_s^n)$, so the moments converge. Hence the mean function of $(Y_t)$ is given by $$ E[Y_t]=\lim_{n\rightarrow+\infty}m_n(t)=m^*(t), $$ and similarly for the covariance function.
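As a numerical sanity check (a sketch of my own, not part of the argument above), take $X$ to be standard Brownian motion, so $m \equiv 0$ and $W(s,t)=\min(s,t)$; integrating $\min(u,v)$ directly gives $W^*(s,t)=\frac{s^2t}{2}-\frac{s^3}{6}$ for $s\le t$. The snippet below approximates $Y$ by the left Riemann sums used in the proof and compares the empirical covariance to $W^*$:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 20000, 200
dt = 1.0 / n_steps

# Simulate standard Brownian motion paths: X ~ GP(0, min(s,t))
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)

# Left Riemann sums Y_t^n = (t/n) sum_{k<n} X(kt/n), here on a fixed grid
Y = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(X[:, :-1] * dt, axis=1)], axis=1)

# Compare the empirical covariance at (s, t) = (0.5, 1.0) with
# W*(s, t) = s^2 t / 2 - s^3 / 6  (Brownian case, valid for s <= t)
s, t = 0.5, 1.0
i_s, i_t = n_steps // 2, n_steps
empirical = np.mean(Y[:, i_s] * Y[:, i_t])  # the mean function is zero here
theoretical = s**2 * t / 2 - s**3 / 6
print(f"empirical: {empirical:.4f}, theoretical: {theoretical:.4f}")
```

The agreement is up to Monte Carlo noise and the $O(1/n)$ Riemann-sum bias.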
$(Z_t)$ is also a Gaussian process. For $t$ with $t+\frac1n\le1$, define $$ Z_t^n=n\left(X\left(t+\frac1n\right)-X(t)\right). $$ By the same arguments as before, $(Z_t^n)$ is a Gaussian process, with mean function $$ m_n(t)=n\left(m\left(t+\frac1n\right)-m(t)\right)\xrightarrow[n\rightarrow+\infty]{}\frac{\mathrm dm(t)}{\mathrm dt}, $$ and covariance function \begin{multline*} W_n(s,t)=n^2\left(W\left(s+\frac1n,t+\frac1n\right)-W\left(s+\frac1n,t\right)-W\left(s,t+\frac1n\right)+W\left(s,t\right)\right)\\ \xrightarrow[n\rightarrow+\infty]{}\frac{\partial^2W(s,t)}{\partial s\partial t}. \end{multline*} By the same arguments as in the first question, we show that $(Z_t)_{0\le t\le1}$ is a Gaussian process with mean function $\frac{\mathrm dm(t)}{\mathrm dt}$ and covariance function $\frac{\partial^2W(s,t)}{\partial s\partial t}$.
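The differentiation result can likewise be checked numerically (again a sketch with an example of my choosing): take the smooth Gaussian process $X_t = A\cos t + B\sin t$ with $A,B$ i.i.d. standard normal, so $W(s,t)=\cos(s-t)$ and $\frac{\partial^2 W(s,t)}{\partial s\partial t} = \cos(s-t)$, and compare the empirical covariance of the difference quotients $Z^n$ with that limit:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths = 50000
A = rng.normal(size=n_paths)
B = rng.normal(size=n_paths)

# Smooth Gaussian process X_t = A cos t + B sin t, with A, B iid N(0, 1):
# mean function m = 0 and covariance function W(s, t) = cos(s - t).
def X(u):
    return A * np.cos(u) + B * np.sin(u)

# Difference quotients Z_t^n = n (X_{t + 1/n} - X_t)
n = 1000
s, t = 0.2, 0.7
Z_s = n * (X(s + 1 / n) - X(s))
Z_t = n * (X(t + 1 / n) - X(t))

# Compare with the claimed limit d^2 W / (ds dt) = cos(s - t)
empirical = np.mean(Z_s * Z_t)  # the mean function of Z is zero here
theoretical = np.cos(s - t)
print(f"empirical: {empirical:.4f}, theoretical: {theoretical:.4f}")
```

Here the exact derivative process is $Z_t=-A\sin t+B\cos t$, whose covariance is indeed $\cos(s-t)$, so the difference quotients should match it closely.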
Remark: in order to show rigorously that $m$ is differentiable and $W$ twice differentiable, I think we need to assume that $(X_t)_{0\le t\le1}$ has $C^1$ sample paths almost surely.