Stochastic processes: understanding of probability measure


Consider a Markov chain $\{X_n, n \geq 0\}$ with finite state space $S = \{1,2,\ldots,m\}$ and transition probability matrix $P = (P_{ij})_{i,j\in S}$. Let $P_{ij}^n$ be the probability that the process, starting in state $i$, is in state $j$ after $n$ transitions. Suppose that for any $i,j \in S$ the limit $\lim_{n\to \infty} P_{ij}^n = \pi_j > 0$ exists and is independent of $i$. Prove that the probability measure $\pi = (\pi_1, \pi_2, \ldots, \pi_m)$ satisfies $\pi P = \pi$. By Chapman–Kolmogorov I proved that $\pi_j = \sum_{i=1}^m P_{ij}^k \pi_i$, but I am unsure where to go from here.

Answer:

It suffices to note that $$ (\pi P)(j) = \sum_{k=1}^m \left(\lim_{n \to \infty} P_{ik}^n\right) P_{kj} = \lim_{n \to \infty} \sum_{k=1}^m P_{ik}^n P_{kj} = \lim_{n \to \infty} P^{n+1}_{ij} = \lim_{n \to \infty} P^{n}_{ij} = \pi(j). $$ The second equality is where finiteness of $S$ matters: a limit can always be interchanged with a *finite* sum. The third equality is Chapman–Kolmogorov, and the fourth holds because $P^{n+1}_{ij}$ and $P^n_{ij}$ are tails of the same convergent sequence, so they share the limit $\pi_j$.
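As a sanity check, the argument can be seen numerically: for a chain whose powers $P^n$ converge, every row of $P^n$ approaches the same vector $\pi$, and that vector satisfies $\pi P = \pi$. Below is a small sketch with a hypothetical $3$-state transition matrix chosen only for illustration (any irreducible, aperiodic chain would do).

```python
import numpy as np

# A hypothetical transition matrix on S = {1, 2, 3}; all entries positive,
# so the chain is irreducible and aperiodic and P^n converges.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# For large n, every row of P^n approaches the same limit pi,
# i.e. lim P_{ij}^n is independent of the starting state i.
Pn = np.linalg.matrix_power(P, 50)
pi = Pn[0]  # any row gives (approximately) the limit

# All rows of P^n agree to numerical precision...
assert np.allclose(Pn, np.tile(pi, (3, 1)))
# ...and the limit is stationary: pi P = pi.
assert np.allclose(pi @ P, pi)

print(np.round(pi, 6))
```

This does not replace the proof, of course; it only illustrates the two facts the proof combines: row-independence of the limit and stationarity of $\pi$.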