Markov chain equivalent definitions


Reading the book *Probability and Random Processes* by Geoffrey R. Grimmett and David R. Stirzaker, I found this evil exercise left to the reader:

The problem

Let $\{X_n\}_{n=0}^{\infty}$ be a sequence of random variables with $X_i\colon\Omega\to S$, where $S$ is a countable set. Prove that the following conditions are equivalent:

  1. For all $n\geq 1$ and all $s,x_0,\dotsc,x_{n-1}\in S$

\begin{equation} P(X_{n}=s|X_{0}=x_{0},\dotsc,X_{n-1}=x_{n-1})=P(X_{n}=s|X_{n-1}=x_{n-1})\text{. } \end{equation}

  2. For all $n\geq 0$, $k\geq 1$, every $s\in S$, and all $x_{n_1},\dotsc,x_{n_k}\in S$

\begin{equation} P(X_{n+1}=s|X_{n_1}=x_{n_1},\dotsc,X_{n_k}=x_{n_k})=P(X_{n+1}=s|X_{n_k}=x_{n_k}) \end{equation} for all $n_1<\dots<n_k\leq n$.

  3. For any $m$, $n\geq 0$ and all $s,x_0,\dotsc,x_m\in S$

\begin{equation} P(X_{m+n}=s|X_{0}=x_{0},\dotsc,X_{m}=x_{m})=P(X_{m+n}=s|X_{m}=x_{m})\text{. } \end{equation}
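As a finite sanity check of these definitions (an illustration only, not part of the exercise; the two-state chain, initial distribution, and transition matrix below are my own choices), one can verify condition (1) exactly, using rational arithmetic so that both conditional probabilities agree without rounding error:

```python
# Exact check of definition (1) for a chain built from a transition matrix:
# the one-step conditional probability given the whole past equals the
# one-step conditional probability given only the present state.
from itertools import product
from fractions import Fraction as F

S = [0, 1]
pi = {0: F(1, 3), 1: F(2, 3)}                      # distribution of X_0
P = {0: {0: F(1, 4), 1: F(3, 4)},                  # transition probabilities
     1: {0: F(1, 2), 1: F(1, 2)}}

def joint(path):
    """P(X_0 = path[0], ..., X_n = path[n]) for the chain above."""
    p = pi[path[0]]
    for a, b in zip(path, path[1:]):
        p *= P[a][b]
    return p

for x0, x1, s in product(S, repeat=3):
    lhs = joint((x0, x1, s)) / joint((x0, x1))     # P(X_2=s | X_0=x0, X_1=x1)
    # P(X_2=s | X_1=x1): marginalize X_0 out of numerator and denominator
    num = sum(joint((a, x1, s)) for a in S)
    den = sum(joint((a, x1)) for a in S)
    rhs = num / den
    assert lhs == rhs
print("definition (1) verified for all states")
```

Of course this only checks one direction for one small chain; the exercise asks for the general equivalence.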

My thoughts so far

(3) with $n:=1$ is just (1), so clearly (3) implies (1).

(2) with $n_i:=i-1$ for $i\in\{1,\dotsc,m+1\}$ (so the conditioning runs over $X_0,\dotsc,X_m$ and $n_k=m$) and target index $n+1:=m+n'$ is just (3), so clearly (2) implies (3).

The only thing left to do is to prove that (1) implies (2), but I have no idea how to do it. I have only tried the basic "tricks" with the definition of conditional probability with no success, but I am clueless beyond that.

One answer:


HINT: $\Omega=\bigcup\limits_{z \in S}\{X_n=z\}$, and denote $P_{x_{n_1}\dots x_{n_k}} := P(\ \cdot \ |X_{n_1}=x_{n_1},\dotsc,X_{n_k}=x_{n_k})$.

Then, by the law of total probability (conditioning on the value of $X_n$): $P_{x_{n_1}\dots x_{n_k}}(X_{n+1}=s)=\sum\limits_{z\in S}P_{x_{n_1}\dots x_{n_k},z}(X_{n+1}=s)P_{x_{n_1}\dots x_{n_k}}(X_n=z)$

Go backwards in this way, applying (1) at each step, until all the missing indices have been filled in.
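To make the hint concrete, here is the case $n_k=n$ of (1) $\Rightarrow$ (2) written out (a sketch in my own notation, not the book's solution). Sum over the completions $y_0,\dotsc,y_{n-1}$ of the missing indices, where $y_{n_i}=x_{n_i}$ is fixed on the conditioned indices:

\begin{align} P(X_{n+1}=s|X_{n_1}=x_{n_1},\dotsc,X_{n_k}=x_{n_k}) &=\sum_{y}P(X_{n+1}=s|X_0=y_0,\dotsc,X_{n-1}=y_{n-1},X_n=x_{n_k})\,P_{x_{n_1}\dots x_{n_k}}(X_0=y_0,\dotsc,X_{n-1}=y_{n-1})\\ &=\sum_{y}P(X_{n+1}=s|X_n=x_{n_k})\,P_{x_{n_1}\dots x_{n_k}}(X_0=y_0,\dotsc,X_{n-1}=y_{n-1})\\ &=P(X_{n+1}=s|X_{n_k}=x_{n_k})\text{, } \end{align}

where the first equality is the law of total probability, the second is (1) applied term by term, and the third holds because the conditional weights sum to $1$. When $n_k<n$ one repeats this argument, going backwards as the hint says, by induction on the number of missing indices.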