Proving a particular Markov Chain has a stationary distribution


So far, when proving that a stationary distribution of a small Markov chain exists, I have simply been eyeballing it: checking that all states communicate (making it irreducible) and that it is possible to go from state n to state n in 1 step (all entries on the diagonal are positive, so it is aperiodic).

But I have come across Markov Chains where it isn't aperiodic, yet it still possesses a stationary distribution such as:

P = \begin{pmatrix}0.2 & 0.2 & 0.6\\ 0.3 & 0 & 0.7\\ 0.5 & 0.1 & 0.4\end{pmatrix}

You can see here that it is not possible to go from state 2 to state 2 in 1 step, so P is not aperiodic, yet it still has a stationary distribution.

Can the existence of a stationary distribution also be proved via the eigenvalues or by proving that P has all positive entries? How could I go about doing this?

I know how to actually find/calculate the stationary distribution; I am just struggling to explain its existence without resorting to the chain being aperiodic and irreducible.


On BEST ANSWER

First off, that chain is in fact aperiodic. You can't go from state 2 to state 2 in one step, but you can in 2 steps (e.g. 2 → 1 → 2) or 3 steps, so the gcd of all return-path lengths from 2 to 2 is 1.
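This is easy to check numerically: the diagonal entries of the powers of P show in how many steps a state can return to itself. A minimal sketch using NumPy (an assumption; any matrix library works):

```python
import numpy as np
from math import gcd

# Transition matrix from the question (states indexed 0, 1, 2).
P = np.array([[0.2, 0.2, 0.6],
              [0.3, 0.0, 0.7],
              [0.5, 0.1, 0.4]])

# Diagonal entry for state 2 (index 1) in successive powers of P.
P2 = P @ P
P3 = P2 @ P

# Returns to state 2 are possible in 2 and 3 steps, and gcd(2, 3) = 1,
# so state 2 is aperiodic (and, by irreducibility, so is the whole chain).
print(P2[1, 1] > 0, P3[1, 1] > 0, gcd(2, 3))
```

Here `P2[1, 1] = 0.3 · 0.2 + 0.7 · 0.1 = 0.13`, confirming the 2-step return.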

Second, aperiodicity is only needed to guarantee convergence to the stationary distribution irrespective of the starting distribution. On a finite state space, irreducibility alone is already sufficient to guarantee that a stationary distribution exists and is unique.
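As for the eigenvalue approach the question asks about: a stationary distribution is exactly a left eigenvector of P for eigenvalue 1, normalized to sum to 1. A minimal sketch, assuming NumPy (note `np.linalg.eig` returns right eigenvectors, so we apply it to the transpose):

```python
import numpy as np

P = np.array([[0.2, 0.2, 0.6],
              [0.3, 0.0, 0.7],
              [0.5, 0.1, 0.4]])

# Left eigenvectors of P are right eigenvectors of P transpose.
eigvals, eigvecs = np.linalg.eig(P.T)

# Pick the eigenvector for the eigenvalue closest to 1
# (a stochastic matrix always has eigenvalue 1).
i = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, i])
pi = pi / pi.sum()          # normalize to a probability vector

print(pi)
print(np.allclose(pi @ P, pi))  # stationarity: pi P = pi
```

For an irreducible finite chain this eigenvector can be chosen with all entries positive, which is why the resulting `pi` is a genuine probability distribution.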