Definition of limiting distribution in a Markov chain -- why do we condition on the initial state?


Given a Markov chain $\{X_n \mid n \in \{0, 1, \ldots\}\}$ with states $\{0, \ldots, N\}$, define the limiting distribution as $$ \pi = (\pi_0, \ldots, \pi_N) $$ where $$ \pi_j = \lim_{n \to +\infty} \mathbb{P}\{X_n = j \mid X_0 = i\} $$

I am confused as to why we condition on $X_0 = i$. What kind of a role does the initial state play? My textbook offers no explanation.

On BEST ANSWER

Consider the following chain on the states $\{0, 1, 2, 3\}$ (so $N = 3$):

$$M= \begin{bmatrix} 0.5 & 0.5 & 0 & 0 \\ 0.4 & 0.6 & 0 & 0 \\ 0 & 0 & 0.3 & 0.7 \\ 0 & 0 & 0.2 & 0.8 \\ \end{bmatrix}$$ where $$M_{i, j} = \mathbb{P}\left\{X_n = j \mid X_{n-1} = i\right\}$$

Notice that $$M^\infty = \begin{bmatrix} \frac 49 & \frac 59 & 0 & 0 \\ \frac 49 & \frac 59 & 0 & 0 \\ 0 & 0 & \frac 29 & \frac 79 \\ 0 & 0 & \frac 29 & \frac 79 \\ \end{bmatrix}$$

Can you still say that $(M^\infty)_{i,j}$ doesn't depend on $i$? Here it clearly does: starting from state $0$ or $1$ the chain stays in $\{0, 1\}$ forever, while starting from state $2$ or $3$ it stays in $\{2, 3\}$. The limit $\lim_{n \to \infty} \mathbb{P}\{X_n = j \mid X_0 = i\}$ therefore genuinely depends on the initial state $i$, which is why the definition must condition on $X_0 = i$. Only when the limit turns out to be the same for every $i$ do we call $\pi$ *the* limiting distribution of the chain.
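A quick numerical sketch of this (using NumPy, with the matrix $M$ from the answer above): raising $M$ to a large power approximates $M^\infty$, and the rows visibly split into two groups depending on the starting state.

```python
import numpy as np

# Transition matrix from the answer: two disjoint closed classes,
# states {0, 1} and {2, 3}. M[i, j] = P{X_n = j | X_{n-1} = i}.
M = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.4, 0.6, 0.0, 0.0],
    [0.0, 0.0, 0.3, 0.7],
    [0.0, 0.0, 0.2, 0.8],
])

# Approximate M^infinity; the subdominant eigenvalues are small,
# so convergence is essentially complete well before n = 100.
M_inf = np.linalg.matrix_power(M, 100)

print(np.round(M_inf, 6))
# Row i holds the limiting probabilities given X_0 = i.
# Rows 0-1 agree with each other (4/9, 5/9, 0, 0), and rows 2-3
# agree with each other (0, 0, 2/9, 7/9), but the two groups differ:
# the limit depends on the initial state.
```

Each row of `M_inf` is the limiting distribution *conditional on that starting state*; because the chain is reducible, the two groups of rows never merge.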