$n$-step transition probability of a Markov chain


Let $(X_t)_{t\in\mathbb{N}_0}$ be a time-homogeneous Markov chain on a finite state space $\left\{1,\dots,m\right\}$, so that $$\Pr(X_{t+1}=j\mid X_t=i_t,\dots,X_0=i_0)=\Pr(X_{t+1}=j\mid X_t=i_t).$$ The probability vectors $\left(\pi^{(t)}\right)_{t\in\mathbb{N}_0}$ with $$\pi_i^{(t)}:=\Pr(X_t=i)$$ are exactly the distributions of the $X_t$. The transition probabilities $$\Pr(X_{t+1}=j\mid X_t=i)=\Pr(X_1=j\mid X_0=i)=:p_{ij}$$ are collected into a transition matrix $M=\left(p_{ij}\right)_{m\times m}$. It is easy to see that $$\pi^{(t+1)}=\pi^{(0)}M^{t+1}.$$

But I don't understand why $$\Pr(X_n=j\mid X_0=i)=\left(M^n\right)_{ij}$$ holds. From the definition of matrix multiplication, for $n=2$ we have $$\left(M^2\right)_{ij}=\sum_{k=1}^m p_{ik}p_{kj}=\sum_{k=1}^m\Pr(X_1=k\mid X_0=i)\Pr(X_1=j\mid X_0=k).$$ Why is this equal to $\Pr(X_2=j\mid X_0=i)$?
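As a numerical sanity check (not part of the question), one can verify on a small chain that the $(i,j)$ entry of $M^2$ agrees with the explicit sum over intermediate states $k$. The transition matrix below is an arbitrary illustrative example:

```python
import numpy as np

# Toy 3-state transition matrix (rows sum to 1); the values are
# arbitrary, chosen only for illustration.
M = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

i, j = 0, 2  # states are 0-indexed here

# Entry (i, j) of the matrix square ...
via_power = np.linalg.matrix_power(M, 2)[i, j]

# ... versus the explicit sum over the intermediate state k
via_sum = sum(M[i, k] * M[k, j] for k in range(M.shape[0]))

assert np.isclose(via_power, via_sum)
print(via_power)  # 0.23 for this particular M
```

This confirms the algebra, but of course not why either quantity equals $\Pr(X_2=j\mid X_0=i)$, which is the actual question.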



BEST ANSWER

Why is this equal to $\Pr(X_2=j\mid X_0=i)$?

Because $$\Pr(X_1=j\mid X_0=k)=\Pr(X_2=j\mid X_1=k)=\Pr(X_2=j\mid X_1=k,X_0=i),$$ first by time-homogeneity, then by the Markov property at time $1$. Multiplying both sides by $\Pr(X_1=k\mid X_0=i)$ and using the definition of conditional probability, $$\Pr(X_1=j\mid X_0=k)\Pr(X_1=k\mid X_0=i)=\Pr(X_2=j,X_1=k\mid X_0=i),$$ and summing over $k$ (the events $\{X_1=k\}$ partition the sample space), $$\sum_{k=1}^m\Pr(X_1=j\mid X_0=k)\Pr(X_1=k\mid X_0=i)=\Pr(X_2=j\mid X_0=i).$$
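For completeness, a sketch not in the original answer: the same two ingredients give the general identity $\Pr(X_n=j\mid X_0=i)=\left(M^n\right)_{ij}$ by induction on $n$. Assuming $\Pr(X_n=k\mid X_0=i)=\left(M^n\right)_{ik}$ for all $k$, $$\left(M^{n+1}\right)_{ij}=\sum_{k=1}^m\left(M^n\right)_{ik}\,p_{kj}=\sum_{k=1}^m\Pr(X_n=k\mid X_0=i)\Pr(X_{n+1}=j\mid X_n=k)=\sum_{k=1}^m\Pr(X_{n+1}=j,X_n=k\mid X_0=i)=\Pr(X_{n+1}=j\mid X_0=i),$$ where the third equality uses time-homogeneity and the Markov property at time $n$, exactly as in the $n=2$ case. This is the Chapman–Kolmogorov equation in matrix form.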