Markov Property as given in Norris' book on Markov chains


In the book, Markov Chains, the following theorem is mentioned:

Let $(X_n)_{n \ge 0}$ be Markov$(\lambda, P)$. Then, conditional on $X_m = i$, $(X_{m+n})_{n \ge 0}$ is Markov$(\delta_i, P)$ and is independent of the random variables $X_0, \dots, X_m$.

I'm a bit confused by the proof in the book, which starts as follows:

Proof:

We have to show that for any event $A$ determined by $X_0, \dots, X_m$ we have: $$P(\{X_m = i_m, \dots, X_{m+n} = i_{m+n}\} \cap A \mid X_m = i) = \delta_{i i_m} p_{i_m i_{m+1}} \cdots p_{i_{m+n-1} i_{m+n}} P(A \mid X_m = i)$$

and then the result follows by Theorem $1.1.1$ (See below).

Theorem $1.1.1$ is: A discrete random process $(X_n)_{0 \leq n \leq N} $ is $Markov(\lambda, P) $ if and only if for all $i_0, \dots , i_N \in I $ $$ P(X_0 = i_0, X_1 = i_1, ... , X_N = i_N) = \lambda_{i_0}p_{i_0 i_1}p_{i_1 i_2}...p_{i_{N-1}i_N} $$

I don't really see how the result follows from Theorem $1.1.1$ after establishing that: $$P(\{X_m = i_m, \dots, X_{m+n} = i_{m+n}\} \cap A \mid X_m = i) = \delta_{i i_m} p_{i_m i_{m+1}} \cdots p_{i_{m+n-1} i_{m+n}} P(A \mid X_m = i)$$

Is there an easier way of doing it? Thanks in advance for the help.
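(As an aside, not from the book: Theorem $1.1.1$ can be sanity-checked numerically. Since $\lambda$ and every row of $P$ sum to $1$, the path probabilities $\lambda_{i_0} p_{i_0 i_1} \cdots p_{i_{N-1} i_N}$ must sum to $1$ over all paths. The two-state chain below is my own toy example, not from the book.)

```python
# Sanity check of Theorem 1.1.1: summing the product formula
#   lam[i_0] * p_{i_0 i_1} * ... * p_{i_{N-1} i_N}
# over all paths (i_0, ..., i_N) must give 1.
from itertools import product

lam = [0.3, 0.7]                # initial distribution (assumed toy example)
P = [[0.9, 0.1],
     [0.4, 0.6]]                # transition matrix (assumed toy example)
N = 3                           # path length

def path_prob(path):
    """lam_{i_0} * p_{i_0 i_1} * ... * p_{i_{N-1} i_N}."""
    p = lam[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

total = sum(path_prob(path) for path in product(range(2), repeat=N + 1))
print(round(total, 10))  # 1.0
```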

Accepted answer:

Assume that the last displayed equation is established. Define $Y_n := X_{m+n}$. Taking $A := \Omega$, we get $$P(\{Y_0 = i_m, \dots, Y_n = i_{m+n}\} \mid Y_0 = i) = \delta_{i, i_m} p_{i_m, i_{m+1}} \cdots p_{i_{m+n-1}, i_{m+n}},$$ so by Theorem $1.1.1$ (applied under the conditional law given $X_m = i$), $(Y_n)_{n \ge 0}$ is Markov$(\delta_i, P)$. Substituting this back into the displayed equation shows that, for any event $A$ determined by $X_0, \dots, X_m$, $$P(\{Y_0 = i_m, \dots, Y_n = i_{m+n}\} \cap A \mid X_m = i) = P(Y_0 = i_m, \dots, Y_n = i_{m+n} \mid X_m = i)\, P(A \mid X_m = i),$$ which is exactly the claimed conditional independence.
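As a sanity check (my own toy example, not from the book or the answer above), the displayed factorization can be verified by brute-force enumeration for a small two-state chain, here with $A = \{X_0 = 0\}$, $m = n = 2$, and $i = 0$:

```python
# Brute-force check of
#   P({X_m=i_m,...,X_{m+n}=i_{m+n}} ∩ A | X_m=i)
#     = delta_{i i_m} p_{i_m i_{m+1}} ... p_{i_{m+n-1} i_{m+n}} * P(A | X_m=i)
# for a toy two-state chain (all parameters below are assumed examples).
from itertools import product

lam = [0.3, 0.7]                       # initial distribution
P = [[0.9, 0.1],
     [0.4, 0.6]]                       # transition matrix
m, n, i = 2, 2, 0                      # condition on X_2 = 0
future = (0, 1, 1)                     # (i_m, i_{m+1}, i_{m+2}), with i_m = i

def path_prob(w):
    """Joint probability of the path w = (x_0, ..., x_{m+n}) via Theorem 1.1.1."""
    p = lam[w[0]]
    for a, b in zip(w, w[1:]):
        p *= P[a][b]
    return p

paths = list(product(range(2), repeat=m + n + 1))

def prob(pred):
    """Probability of the event {w : pred(w)} by summing path probabilities."""
    return sum(path_prob(w) for w in paths if pred(w))

A = lambda w: w[0] == 0                # an event determined by X_0, ..., X_m
p_i = prob(lambda w: w[m] == i)        # P(X_m = i)

lhs = prob(lambda w: A(w) and w[m:] == future) / p_i
rhs = ((future[0] == i)                # the Kronecker delta
       * P[future[0]][future[1]] * P[future[1]][future[2]]
       * prob(lambda w: A(w) and w[m] == i) / p_i)
print(abs(lhs - rhs) < 1e-12)          # True
```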