Show that if $\{X_n\}$ is a Markov Chain


Show that if $\{X_n\}$ is a Markov chain, then $$P(X_n=j\mid X_k=l,X_m=i)=P(X_n=j\mid X_m=i),\qquad 0\leq k<m<n.$$

What I did is

$$P(X_n=j\mid X_k=l,X_m=i)=\frac{P(X_n=j,X_k=l,X_m=i)}{P(X_k=l,X_m=i)}=\frac{\sum P(X_n=j,X_{n-1}=i_{n-1},\dots,X_{m+1}=i_{m+1},X_m=i,\dots,X_k=l,\dots,X_0=i_0)}{P(X_k=l,X_m=i)}$$

where the sum runs over all $i_r$ with $r\notin\{k,m,n\}$. Conditioning on the history up to time $m$,

$$=\frac{\sum P(X_n=j,X_{n-1}=i_{n-1},\dots,X_{m+1}=i_{m+1}\mid X_m=i,\dots,X_k=l,\dots,X_0=i_0)\,P(X_m=i,\dots,X_k=l,\dots,X_0=i_0)}{P(X_k=l,X_m=i)}$$

By the Markov property the first factor depends on the history only through $X_m=i$, and summing it over $i_{m+1},\dots,i_{n-1}$ yields $P(X_n=j\mid X_m=i)$, so

$$=P(X_n=j\mid X_m=i)\cdot\frac{\sum_{i_{m-1},\dots,i_0}P(X_m=i,\dots,X_k=l,\dots,X_0=i_0)}{P(X_k=l,X_m=i)}=P(X_n=j\mid X_m=i)$$
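The identity can also be sanity-checked numerically by brute-force enumeration of all trajectories of a small chain. The chain below (2 states, arbitrary initial distribution `pi0` and transition matrix `P`, and the particular indices $k=1$, $m=2$, $n=4$) is a hypothetical example chosen just for the check, not part of the question:

```python
from itertools import product

# Hypothetical 2-state chain: initial distribution and transition matrix
# chosen arbitrarily for the check.
pi0 = [0.3, 0.7]
P = [[0.9, 0.1],
     [0.4, 0.6]]

n = 4  # trajectories X_0, ..., X_4

def path_prob(path):
    """Probability of one full trajectory under the chain."""
    p = pi0[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

def prob(constraints):
    """P(X_t = s for every (t, s) in constraints), by enumerating all paths."""
    return sum(path_prob(path)
               for path in product(range(2), repeat=n + 1)
               if all(path[t] == s for t, s in constraints))

k, m = 1, 2              # indices with 0 <= k < m < n
j, l, i = 0, 1, 0        # states at times n, k, m

lhs = prob([(n, j), (k, l), (m, i)]) / prob([(k, l), (m, i)])
rhs = prob([(n, j), (m, i)]) / prob([(m, i)])
assert abs(lhs - rhs) < 1e-12  # P(X_n=j | X_k=l, X_m=i) = P(X_n=j | X_m=i)
```

Here `rhs` is just the two-step transition probability $(P^2)_{ij}$, so the check confirms that the extra conditioning on $X_k=l$ does not change the answer.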

Is it right? Is there an easier way?


Best Answer

Yes. That's it.

A Markov chain is a sequence of random variables $\{X_k\}_{k\in\{0..n\}}$, representing $n+1$ successive states of a system, such that for all indices $0\leq a< b< c\leq n$ and all supported values $i_0,\dots,i_b$ and $i_c$ we have:

$$\mathsf P(X_c=i_c\mid \bigcap_{k\in\{0..b\}} X_k=i_k)= \mathsf P(X_c=i_c\mid X_b=i_b)$$

So if we let $K =\{0..a-1\}\cup\{a+1..b-1\}$, then:

$$ \begin{align} & \mathsf P(X_c=i_c\mid X_b=i_b, X_a=i_a) \\[2ex] = & \sum_{\{i_k\}_{k\in K}} {\mathsf P(X_c=i_c\mid X_b=i_b, X_a=i_a,\bigcap_{k\in K}X_k=i_k)\;\mathsf P(\bigcap_{k\in K}X_k=i_k\mid X_b=i_b, X_a=i_a)} \\[2ex] = & \mathsf P(X_c=i_c\mid X_b=i_b)\sum_{\{i_k\}_{k\in K}} \mathsf P(\bigcap_{k\in K}X_k=i_k\mid X_b=i_b, X_a=i_a) \\[2ex] = & \mathsf P(X_c=i_c\mid X_b=i_b) \end{align} $$

This is what you said, just a little more compactly. Note that the weights in the second line must be conditional probabilities so that they sum to $1$ in the last step.