I just read Chapter 4.1 of Gardiner. He defines the so-called "white noise" $\xi$ by the properties $\left<\xi(t)\right> = 0$ and $\xi(t)$ independent of $\xi(t')$ for $t\neq t'$. The independence property makes $\left<\xi(t)\xi(t')\right>$ vanish for $t\neq t'$, and with the usual normalization $\left<\xi(t)\xi(t')\right> = \delta(t-t')$. Now he defines $u(t')$ as
$$u(t') = \int_0^{t'} ds\, \xi(s).$$
He claims that, assuming $u(t)$ is continuous, $u(t)$ is a Markov process. The argument goes as follows:
$$u(t') = \lim_{\epsilon\rightarrow0}\left[ \int_0^{t-\epsilon} \xi(s) ds\right] + \int_t^{t'}ds \xi(s).$$
- For any $\epsilon>0$ the $\xi(s)$'s in the first integral are independent of the $\xi(s)$'s in the second.
(OK, so far I agree.)
- Hence by continuity $u(t)$ and $u(t')-u(t)$ are statistically independent.
(I also agree on that. It follows from the preceding argument by noting that $u(t) = \lim_{\epsilon\rightarrow0}\left[ \int_0^{t-\epsilon} \xi(s)\, ds\right]$ and $u(t')-u(t) = \int_t^{t'}ds\, \xi(s)$. A quick numerical check of this independence is at the end of the question.)
- Further, $u(t') - u(t)$ is independent of $u(t'')$ for all $t''<t$.
(Nothing new; it follows from the first point.)
- This means that $u(t')$ is fully determined (probabilistically) by the value $u(t)$ alone, and not by any earlier value. Hence, $u(t)$ is a Markov process.
I don't get the last point. Why does this follow?
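For what it's worth, here is a quick numerical sanity check of the independence claims above (my own sketch, not from Gardiner). I discretize the white noise on a grid of step $\Delta t$ as independent Gaussians with variance $1/\Delta t$, so that $\left<\xi_i\xi_j\right> = \delta_{ij}/\Delta t$ approximates the delta correlation; the time values, grid size, and variable names are mine:

```python
# Discretized white noise: independent Gaussians with variance 1/dt,
# so that <xi_i xi_j> = delta_ij / dt approximates <xi(t)xi(t')> = delta(t-t').
import numpy as np

rng = np.random.default_rng(0)

dt = 1e-2
t, t_prime = 0.5, 1.0            # the two times, t < t'
n_paths = 50_000                 # number of independent realizations
n_t = int(t / dt)                # grid points in [0, t)
n_tp = int(t_prime / dt)         # grid points in [0, t')

# xi has shape (n_paths, n_tp); each entry ~ N(0, 1/dt)
xi = rng.normal(0.0, 1.0 / np.sqrt(dt), size=(n_paths, n_tp))

u_t = xi[:, :n_t].sum(axis=1) * dt        # u(t)        = int_0^t   xi(s) ds
increment = xi[:, n_t:].sum(axis=1) * dt  # u(t')-u(t)  = int_t^t'  xi(s) ds

# Independence implies zero correlation; for jointly Gaussian variables,
# zero correlation is equivalent to independence.
print("corr[u(t), u(t')-u(t)] =", np.corrcoef(u_t, increment)[0, 1])  # ~ 0
print("Var[u(t)] =", u_t.var(), "(should be ~ t =", t, ")")
```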
It follows from the definition of a Markov process (Gardiner): for ordered times $t_1 \geq t_2 \geq \ldots \geq t_n$,
$$p(x_1,t_1|x_2,t_2;x_3,t_3;\ldots;x_n,t_n) = p(x_1,t_1|x_2,t_2).$$
In our case we have to demonstrate that for $t''<t<t'$
$$p(u(t'),t'|u(t),t;u(t''),t'') = p(u(t'),t'|u(t),t).$$
As you stated, the increment $u(t')-u(t)$ is independent of $u(t)$ and of $u(t'')$ for every $t''<t$. This means that knowing the value $u(t'')$ at the earlier time $t''$ adds nothing: $u(t')$ is just $u(t)$ plus an increment that does not depend on $u(t'')$ (and this holds for every $t''<t$). So in the conditional probability you can simply drop the conditioning on $u(t''),t''$.
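To spell that last step out (the notation $w$ and $p_w$ is mine, not Gardiner's): write $u(t') = u(t) + w$ with $w = \int_t^{t'} ds\, \xi(s)$, which is independent of both $u(t)$ and $u(t'')$. Then
$$p(u(t'),t'|u(t),t;u(t''),t'') = p_w\big(u(t')-u(t)\big) = p(u(t'),t'|u(t),t),$$
where $p_w$ is the probability density of the increment $w$. The middle expression contains no reference to $u(t'')$, which is exactly the Markov condition above.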