Showing Markov property for "Brownian Motion"


In our lecture, we defined "Brownian Motion" as a Gaussian process $X$ on $\mathbb{R}_+$ with covariance $C(t,s) = t \land s$, where the quotation marks indicate that this is not yet the final definition of the notion. As an exercise, I have to show that this process satisfies the Markov property, in the sense that $$\mathbb{E}[f(X_{t_N}) \mid \sigma(X_{t_{N-1}},\dots,X_{t_1})] = \mathbb{E}[f(X_{t_N}) \mid X_{t_{N-1}}] \text{ a.s.,}$$ where $f:\mathbb{R}\rightarrow \mathbb{R}$ is measurable and bounded, $N \in \mathbb{N}$ and $0 \leq t_1 < \dots < t_N < \infty$. However, I am not quite sure how to do it.
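To make the definition concrete for myself, here is a small numerical sketch (the times and sample count are arbitrary choices of mine): the finite-dimensional marginals $(X_{t_1},\dots,X_{t_N})$ are centered Gaussian vectors with covariance matrix $C_{ij} = t_i \land t_j$, which one can sample directly and compare against the prescribed covariance.

```python
import numpy as np

# Covariance function from the lecture definition: C(t, s) = min(t, s)
def cov_matrix(times):
    t = np.asarray(times, dtype=float)
    return np.minimum.outer(t, t)  # matrix with entries min(t_i, t_j)

# Sample the finite-dimensional marginal (X_{t_1}, ..., X_{t_N}):
# a centered Gaussian vector with this covariance matrix.
rng = np.random.default_rng(0)
times = [0.5, 1.0, 2.0]          # arbitrary 0 <= t_1 < t_2 < t_3
C = cov_matrix(times)
samples = rng.multivariate_normal(np.zeros(len(times)), C, size=100_000)

# The empirical covariance of the samples should be close to C
C_emp = np.cov(samples, rowvar=False)
```

This only checks the marginal law at finitely many times, of course, which is all the exercise needs.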

This question has been asked a few times before, but the answers there always use the proper definition of Brownian Motion, unlike in my case. The idea was to consider $Y_N = X_{t_N}-X_{t_{N-1}}$; then $$(Y_N, X_{t_{N-1}},\dots,X_{t_1})^T = A(X_{t_N}, X_{t_{N-1}},\dots,X_{t_1})^T,$$ where $A$ is easily seen to be an invertible matrix. Invertible linear maps preserve joint Gaussianity, hence the vector on the left is jointly Gaussian. Now it would be really helpful if $Y_N$ were independent of $\sigma(X_{t_{N-1}},\dots,X_{t_1})$, because then we could compute $$\mathbb{E}[f(X_{t_N}) \mid \sigma(X_{t_{N-1}},\dots,X_{t_1})] = \mathbb{E}[f(Y_N+X_{t_{N-1}}) \mid \sigma(X_{t_{N-1}},\dots,X_{t_1})] = g(X_{t_{N-1}}),$$ where $g(x) := \mathbb{E}[f(Y_N+x)]$, since $Y_N$ is independent of the $\sigma$-algebra and $X_{t_{N-1}}$ is measurable with respect to it. The same argument applied to the smaller $\sigma$-algebra $\sigma(X_{t_{N-1}})$ gives $$\mathbb{E}[f(X_{t_N}) \mid X_{t_{N-1}}] = g(X_{t_{N-1}}),$$ so the two conditional expectations agree a.s. In the proper definition of Brownian Motion, the increments are assumed to be independent, so the independence is part of the definition; in my setting, however, I don't see how to prove it. Can someone give me a tip? Is this even the correct approach, or should I consider something else?
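As a sanity check on whether such independence could even hold, one can at least verify numerically that the increment $Y_N$ is uncorrelated with each earlier $X_{t_i}$ under $C(t,s)=t\land s$: one has $\operatorname{Cov}(Y_N, X_{t_i}) = C(t_N,t_i) - C(t_{N-1},t_i)$, and for $i \leq N-1$ both terms equal $t_i$. This is a quick sketch with arbitrary times, not a proof:

```python
import numpy as np

# Arbitrary times 0 <= t_1 < ... < t_N and the covariance C(t, s) = min(t, s)
times = np.array([0.5, 1.0, 2.0, 3.0])
C = np.minimum.outer(times, times)

# By bilinearity, Cov(Y_N, X_{t_i}) = C(t_N, t_i) - C(t_{N-1}, t_i)
# for the increment Y_N = X_{t_N} - X_{t_{N-1}}.
# For i <= N-1 both terms equal t_i, so the difference vanishes.
cov_increment = C[-1, :-1] - C[-2, :-1]
print(cov_increment)
```

Since the full vector is jointly Gaussian, zero covariance would give the independence the argument above needs; my question is essentially whether this is the intended route.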