Markov property extends to all future times


I'm studying Pavliotis' Stochastic Processes and I am having trouble with one of the exercises.

Specifically, the first exercise of chapter two is to prove that the Markov property in the sense of the immediate future being independent of the past given the present, i.e. $P(X_{n+1} | X_1, \dots, X_n) = P(X_{n+1} | X_n)$, implies that arbitrary futures are independent of the past given the present, i.e. $P(X_{n+m} | X_1, \dots, X_n) = P(X_{n+m} | X_n)$.

Conceptually I imagine a proof using induction along the lines of: Assume the equality holds for some $m$, then \begin{gather} P(X_{n+m+1} | X_1, \dots, X_n) = \int P(X_{n+m+1}, X_{n+m} = x | X_1, \dots, X_n) dx \\ = \int P(X_{n+m+1} | X_{n+m} = x) P(X_{n+m} = x | X_1, \dots, X_n) dx \\ = \int P(X_{n+m+1} | X_{n+m} = x) P(X_{n+m} = x | X_n) dx \\ = \int P(X_{n+m+1}, X_{n+m} = x | X_n) dx = P(X_{n+m+1} | X_n) \end{gather}
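For a finite-state chain the integral above becomes a sum over the intermediate state, and the induction step is exactly one matrix multiplication: the $(m+1)$-step transition probabilities equal the $m$-step probabilities multiplied by the one-step matrix. Here is a minimal numerical sketch of that step (the particular matrix `P` is a made-up example, not from the book):

```python
import numpy as np

# One-step transition matrix of a hypothetical 3-state Markov chain:
# P[i, j] = P(X_{n+1} = j | X_n = i); rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

m = 4
P_m = np.linalg.matrix_power(P, m)            # P(X_{n+m} = j | X_n = i)

# Induction step, the discrete analogue of the integral in the question:
# P(X_{n+m+1} = j | X_n = i)
#   = sum_x P(X_{n+m+1} = j | X_{n+m} = x) * P(X_{n+m} = x | X_n = i)
P_m1_chapman = P_m @ P                        # sum over the intermediate state x
P_m1_direct = np.linalg.matrix_power(P, m + 1)

assert np.allclose(P_m1_direct, P_m1_chapman)
```

The matrix product `P_m @ P` is precisely the sum over the intermediate value $x$ in the second and third lines of the display above; the measure-theoretic question is what justifies that decomposition when the state space is not discrete.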

However, this is based on intuition and my experience with non-measure-theoretic probability theory. I cannot justify these steps when I think of $P$ as a measure and $P(X=x)$ as meaning $P\{X^{-1}(x)\}$. In fact, I have had trouble finding a definition of conditional probability that is at all helpful. Most authors seem to provide the abstract definition in terms of conditional expectation with respect to a $\sigma$-algebra, but I haven't found any resources that show how to work with this definition.

So my question is: how (if at all) are these steps, particularly the first two equalities, justified from a measure-theoretic perspective?


A Markov process $X_t$ is a stochastic process whose evolution can be described by transition operators $T_{s,t}$, which map the law of $X_s$ to the law of $X_t$. For the particular case of a finite-state Markov chain, each operator is a stochastic matrix.

For your original question, we are given the one-step operators $T_{n,n+1}$. Composing them gives $T_{n,n+m}=T_{n+m-1,n+m}\cdots T_{n,n+1}$, so the law of $X_{n+m}$ given the past depends on $X_n$ alone, through this product of the given operators. This composition rule is the Chapman–Kolmogorov equation.
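The composition of one-step operators can be sketched concretely for a finite-state, possibly time-inhomogeneous chain. This is an illustrative example, not the book's construction: each `T[k]` below is a random row-stochastic matrix standing in for $T_{k,k+1}$, and distributions are row vectors acted on from the right, so the product reads left to right (the operator order in the answer flips depending on which side the operator acts).

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n):
    """A random n x n row-stochastic matrix (rows sum to 1)."""
    M = rng.random((n, n))
    return M / M.sum(axis=1, keepdims=True)

# Time-inhomogeneous chain: a (possibly different) one-step operator per step.
# T[k] maps the law at time k to the law at time k+1.
T = [random_stochastic(3) for _ in range(4)]

# Composed operator T_{0,4} acting on row distributions:
T_04 = np.linalg.multi_dot(T)

# Propagating an initial distribution one step at a time gives the same law.
mu = np.array([1.0, 0.0, 0.0])      # start in state 0 with probability 1
mu_step = mu.copy()
for Tk in T:
    mu_step = mu_step @ Tk

assert np.allclose(mu @ T_04, mu_step)
```

A product of row-stochastic matrices is again row-stochastic, so the composed operator `T_04` is itself a valid transition matrix, which is the operator-level content of the claim that the $m$-step transition is determined by the one-step transitions.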