Ergodic Markov Chain


Suppose there exists an ergodic Markov chain with symmetric transition probabilities. For this Markov chain, why is the stationary distribution uniform?


The row sums of a symmetric matrix are equal to its column sums. For a Markov transition matrix, the row sums are all equal to 1, so for a symmetric Markov transition matrix the column sums are also all equal to 1. But this means that the uniform distribution (a vector of all ones, rescaled to sum to 1) is a stationary distribution. Since the chain is ergodic, there is only one stationary distribution, so the uniform distribution must be it.
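This is easy to check numerically. Below is a small sketch using a concrete $3\times 3$ symmetric transition matrix (chosen here just for illustration; any symmetric ergodic example works the same way):

```python
import numpy as np

# A concrete symmetric, ergodic transition matrix: all entries positive,
# each row sums to 1, and symmetry forces each column to sum to 1 too.
T = np.array([
    [0.5, 0.3, 0.2],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

assert np.allclose(T, T.T)            # symmetric
assert np.allclose(T.sum(axis=1), 1)  # row sums are 1 (stochastic)
assert np.allclose(T.sum(axis=0), 1)  # column sums are 1 (by symmetry)

pi = np.full(3, 1/3)                  # uniform distribution on 3 states
print(np.allclose(pi @ T, pi))        # True: uniform is stationary
```

The column-sum check is exactly the argument above: once every column sums to 1, multiplying the uniform row vector by $T$ returns the uniform row vector.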


There is the intuition, and then there's the formal description. Here is the formal description:

  1. Note that for any matrix $M$, the product $[1,1,\ldots, 1]\cdot M$ computes a row vector containing the column sums of $M$.

  2. For any matrix of transition probabilities $T$, a stationary distribution is a row vector of probabilities $\pi$ such that $\pi \cdot T = \pi$.

    (When representing transition probabilities as a matrix, $T_{i,j}$ is the probability of transitioning from state $i$ to state $j$.)

  3. For any matrix of transition probabilities $T$, the sum of the entries in each row is 1, because the total probability of transitioning somewhere is 1.

  4. If $T$ is moreover symmetric, its columns each sum to 1.

  5. Hence for a symmetric transition matrix, $[1, \ldots, 1] \cdot T = [1, \ldots, 1]$.

  6. We can scale both sides by $1/N$, where $N$ is the number of states, to make the vector a valid probability distribution: $$\frac{1}{N}[1,\ldots,1] \cdot T = \frac{1}{N}[1,\ldots,1]$$ $$\pi \equiv \frac{1}{N}[1,\ldots,1]$$

    $$\pi \cdot T = \pi $$

  7. So $\pi = \frac{1}{N}[1,\ldots, 1]$ is a stationary distribution of the symmetric matrix of transition probabilities $T$. If $T$ is ergodic, it only has one stationary distribution, so this must be it.