I've been given the following definition:
For a THMC (time-homogeneous Markov chain) with one-step transition matrix $\mathbf{P}$, the row vector $\mathbf{\pi}$ with elements $(\pi_{i})_{i \in S}$ (where $S$ is the state space) is a stationary distribution iff $\mathbf{\pi} \mathbf{P} = \mathbf{\pi}$.
However, I also know that many THMCs will have multiple stationary distributions.
This leaves me with the following questions:
- How can you tell how many stationary distributions a THMC has?
- How can I show that a THMC has only one stationary distribution?
- The equation $\mathbf{\pi \; P} = \mathbf{\pi}$ looks like it should only have one solution, so how is it possible to have multiple stationary distributions?
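To see concretely how the third point can happen, here is a minimal sketch (my own made-up example, not from any text): a reducible chain whose transition matrix is block diagonal, so the equation $\mathbf{\pi} \mathbf{P} = \mathbf{\pi}$ has infinitely many probability-vector solutions.

```python
import numpy as np

# Hypothetical reducible chain: two disconnected 2-state blocks.
# States {0, 1} never communicate with states {2, 3}.
P = np.array([[0.5, 0.5, 0.0, 0.0],
              [0.5, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.5],
              [0.0, 0.0, 0.5, 0.5]])

pi1 = np.array([0.5, 0.5, 0.0, 0.0])   # stationary on the first block
pi2 = np.array([0.0, 0.0, 0.5, 0.5])   # stationary on the second block

# Every convex combination a*pi1 + (1-a)*pi2 is again a probability
# vector satisfying pi P = pi, so there is a continuum of solutions.
for a in (0.0, 0.3, 1.0):
    pi = a * pi1 + (1 - a) * pi2
    assert np.allclose(pi @ P, pi)
```

So $\mathbf{\pi} \mathbf{P} = \mathbf{\pi}$ is a linear system whose solution set can be a whole subspace; the probability-vector constraint cuts it down to a simplex, which need not be a single point.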
Stationary Distributions:
Let $\mathbf{P}$ be the transition probability matrix of a homogeneous Markov chain $\{X_n, n \geq 0\}$. If there exists a probability vector $\mathbf{\pi}$ such that $$\mathbf{\pi} \mathbf{P} = \mathbf{\pi} \:\:\:\:\:\:\: (1)$$
then $\mathbf{\pi}$ is called a stationary distribution for the Markov chain. Equation $(1)$ indicates that a stationary distribution $\mathbf{\pi}$ is a (left) eigenvector of $\mathbf{P}$ with eigenvalue $1$. Note that any nonzero multiple of $\mathbf{\pi}$ is also such an eigenvector. But the stationary distribution $\mathbf{\pi}$ is singled out by the requirement that it be a probability vector; that is, its components are nonnegative and sum to unity.
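This eigenvector characterization can be checked numerically. The sketch below (my own, using NumPy; the two-state matrix $\mathbf{P}$ is a made-up example) finds a left eigenvector of $\mathbf{P}$ with eigenvalue $1$ and rescales it so its components sum to unity:

```python
import numpy as np

# Hypothetical two-state chain for illustration.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Left eigenvectors of P are right eigenvectors of P.T.
eigvals, eigvecs = np.linalg.eig(P.T)

# Select the eigenvector whose eigenvalue is (numerically) 1.
idx = np.argmin(np.abs(eigvals - 1.0))
v = np.real(eigvecs[:, idx])

# Any nonzero multiple of v is an eigenvector; the stationary
# distribution is the unique multiple whose components sum to 1.
pi = v / v.sum()

print(pi)       # -> [0.8333... 0.1666...], i.e. (5/6, 1/6)
print(pi @ P)   # equals pi, verifying pi P = pi
```

Solving $\mathbf{\pi} \mathbf{P} = \mathbf{\pi}$ by hand for this matrix gives $\pi_1 = 5\pi_2$, so $\mathbf{\pi} = (5/6, 1/6)$, matching the numerical result.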
Limiting Distributions:
A Markov chain is called regular if there is a finite positive integer $m$ such that after $m$ time-steps, every state has a nonzero chance of being occupied, no matter what the initial state. Let $A > 0$ denote that every element $a_{ij}$ of $A$ satisfies the condition $a_{ij} > 0$. Then, for a regular Markov chain with transition probability matrix $\mathbf{P}$, there exists an $m > 0$ such that $\mathbf{P}^m > 0$. For a regular homogeneous Markov chain we have the following theorem:
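Regularity is easy to test numerically for a small chain: raise $\mathbf{P}$ to successive powers and check for a strictly positive matrix. A sketch (my own; `is_regular` and both example matrices are hypothetical, and the cutoff `max_m` is arbitrary):

```python
import numpy as np

def is_regular(P, max_m=50):
    """Return (True, m) if some power P^m with m <= max_m is strictly
    positive (so the chain is regular); otherwise (False, None)."""
    Pm = np.eye(P.shape[0])
    for m in range(1, max_m + 1):
        Pm = Pm @ P
        if np.all(Pm > 0):
            return True, m
    return False, None

# Regular chain: P itself has a zero entry, but P^2 > 0.
P = np.array([[0.0, 1.0],
              [0.5, 0.5]])
print(is_regular(P))   # -> (True, 2)

# Periodic chain: powers alternate between two matrices with zeros,
# so no power is ever strictly positive.
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(is_regular(Q))   # -> (False, None)
```

Note the caveat in the second example: a finite cutoff can only confirm regularity, not refute it in general, though for the periodic chain $Q$ one can argue directly that no power is positive.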