Markov property for a Stochastic Process


My question: Every stochastic process $X(t)$, $t\geq 0$, with state space $\mathcal{S}$ and independent increments has the Markov property, i.e., for each $y\in \mathcal{S}$ and $0\leq t_0< t_1<\cdots <t_n<\infty$ we have $$ P[X(t_n)\leq y\mid X(t_0),X(t_1), \ldots, X(t_{n-1})] =P[X(t_n)\leq y\mid X(t_{n-1})]. $$

This theorem is stated in Kannan's book, An Introduction to Stochastic Processes, on page 93. There is a sketch of a proof, but as a beginner I find it hard to follow. I would like to see a detailed proof, or a good reference for this theorem.


There are 2 best solutions below


There's not much to this statement:
$$\begin{aligned}
P[X(t_n) \leq y \mid X(t_0), \ldots, X(t_{n-1})] &= P[X(t_n) - X(t_{n-1}) \leq y - X(t_{n-1}) \mid X(t_0), \ldots, X(t_{n-1})] \\
&= P[X(t_n) - X(t_{n-1}) \leq y - X(t_{n-1}) \mid X(t_{n-1}), X(t_{n-1})- X(t_{n-2}), \ldots, X(t_1) - X(t_0)] \\
&= P[X(t_n) - X(t_{n-1}) \leq y - X(t_{n-1}) \mid X(t_{n-1})] \\
&= P[X(t_n) \leq y \mid X(t_{n-1})],
\end{aligned}$$
where we rewrote the information in $X(t_0),\ldots,X(t_{n-1})$ in terms of the corresponding increments and then used the independent-increments property: the increment $X(t_n) - X(t_{n-1})$ is independent of all the earlier increments, so conditioning on them can be dropped.
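As a sanity check (my example, not from the book), here is a minimal Monte Carlo sketch using a simple $\pm 1$ random walk, which has independent increments: conditioning on the extra past value $X_1$ should not change the conditional distribution of $X_3$ given $X_2$.

```python
import random

random.seed(0)

def walk(n):
    # simple +/-1 random walk started at 0: independent increments
    x = [0]
    for _ in range(n):
        x.append(x[-1] + random.choice([-1, 1]))
    return x

N = 200_000
paths = [walk(3) for _ in range(N)]

def cond_prob(y, x1):
    # estimate P[X_3 <= y | X_2 = 0, X_1 = x1] by Monte Carlo
    sel = [p for p in paths if p[2] == 0 and p[1] == x1]
    return sum(p[3] <= y for p in sel) / len(sel)

p_minus = cond_prob(0, -1)   # past went through X_1 = -1
p_plus = cond_prob(0, +1)    # past went through X_1 = +1
print(p_minus, p_plus)       # both close to 1/2: the extra past is irrelevant
```

Given $X_2 = 0$, we have $X_3 \leq 0$ exactly when the last step is $-1$, so both estimates should be near $1/2$ regardless of $X_1$.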


For every nonnegative $s$ and $t$, let $X^t_s=X_{t+s}-X_t$. The hypothesis is then that the processes $X^t=(X^t_s)_{s\geqslant0}$ and ${}^t\!X=(X_s)_{s\leqslant t}$ are independent.

For every $s\geqslant t$, $X_s=X_t+X^t_{s-t}$, hence the process $(X_s)_{s\geqslant t}$ is a deterministic function of the random variable $X_t$, which is ${}^t\!X$-measurable, and of the process $X^t$, which is independent of ${}^t\!X$. Thus, the conditional distribution of the future $(X_s)_{s\geqslant t}$ given the past ${}^t\!X$ depends only on the present $X_t$. This is the Markov property.
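The key fact used here, that the future increment process $X^t$ is independent of the past ${}^t\!X$, can be sketched numerically (my illustration, not from the answer) with a $\pm 1$ random walk: the sample correlation between $X_t$ and the increment $X^t_s = X_{t+s} - X_t$ should be near zero.

```python
import random

random.seed(1)

N = 100_000
samples = []
for _ in range(N):
    # X built from 8 independent +/-1 steps; take t = 5, s = 3
    steps = [random.choice([-1, 1]) for _ in range(8)]
    X_t = sum(steps[:5])   # present value at time t (a function of the past)
    inc = sum(steps[5:])   # increment X^t_s = X_{t+s} - X_t
    samples.append((X_t, inc))

# sample correlation between the present X_t and the future increment
mx = sum(x for x, _ in samples) / N
mi = sum(i for _, i in samples) / N
cov = sum((x - mx) * (i - mi) for x, i in samples) / N
vx = sum((x - mx) ** 2 for x, _ in samples) / N
vi = sum((i - mi) ** 2 for _, i in samples) / N
corr = cov / (vx * vi) ** 0.5
print(round(corr, 3))   # near 0: the increment carries no information about the past
```

Zero correlation alone does not prove independence, of course; here it merely illustrates that the increment is built from fresh steps, disjoint from those generating the past.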