Markov chain steady state existence


Is it possible for a Markov chain to have no steady-state solution? What is an example of such a system?


BEST ANSWER

A finite Markov chain always has at least one steady-state (stationary) distribution. Write the transition matrix $A$ in column-stochastic form, so that a distribution evolves as $x \mapsto Ax$. Each column of $A-I$ sums to $0$, so $A-I$ is singular and $Ax=x$ has at least one nontrivial solution; by the Perron–Frobenius theorem this solution can be chosen with nonnegative entries, and normalizing it to sum to $1$ gives a stationary distribution.
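A minimal numerical sketch of this argument, using a made-up $3\times 3$ column-stochastic matrix (the specific entries are only for illustration):

```python
import numpy as np

# Column-stochastic transition matrix A (each column sums to 1),
# matching the convention A x = x used above. Entries are arbitrary.
A = np.array([
    [0.5, 0.2, 0.3],
    [0.3, 0.5, 0.2],
    [0.2, 0.3, 0.5],
])

# A - I is singular because each column of A - I sums to 0, so the
# eigenvalue 1 always appears; its eigenvector is the stationary state.
eigvals, eigvecs = np.linalg.eig(A)
i = np.argmin(np.abs(eigvals - 1.0))   # locate the eigenvalue 1
pi = np.real(eigvecs[:, i])
pi = pi / pi.sum()                     # normalize to a distribution

print(pi)                              # satisfies A @ pi == pi
```

Normalizing by the sum also fixes the sign, since the Perron eigenvector of a positive matrix has entries of one sign.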

On the other hand, a Markov chain with an infinite state space need not have a steady-state distribution. For example, consider the chain with state space $\{1,2,3,4,\ldots\}$, where state $k$ transitions to state $k+1$ with probability $1$.
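A tiny sketch of why this chain has no stationary distribution: any stationary $\pi$ would need $\pi(k+1)=\pi(k)$ for every $k$, forcing all masses equal, which cannot sum to $1$. Meanwhile the chain itself simply escapes to infinity:

```python
# The "always move right" chain on {1, 2, 3, ...}:
# state k goes to k + 1 with probability 1.
def step(k):
    return k + 1          # deterministic transition k -> k+1

state = 1
for _ in range(10):
    state = step(state)

# The chain never revisits any state, so no probability mass
# can settle anywhere.
print(state)              # 11 after ten steps from state 1
```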


In the excellent discussion of Markov chains in Feller's *An Introduction to Probability Theory and Its Applications*, the term "steady state" appears nowhere. If your definition coincides with "persistent state", then that means that the probability $f_{ii}$, given an initial state of $i$, of eventually returning to $i$ is $1$. More precisely, $$ f_{ii} = \sum_{n=1}^{\infty} f_{ii}^{(n)} = 1, $$ where $f_{ii}^{(n)}$ is the probability of a first return to state $i$ at time $n$.
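A small sketch of this criterion for a two-state chain (the transition probabilities below are invented for illustration): the first-return probabilities to state $0$ can be written in closed form, and their sum converges to $1$, so state $0$ is persistent.

```python
# Two-state chain; rows are "from" states. Made-up probabilities.
p00, p01 = 0.5, 0.5   # from state 0
p10, p11 = 0.3, 0.7   # from state 1

# First return to 0 at time 1: stay put. At time n >= 2: move to
# state 1, linger there for n - 2 steps, then come back.
def f(n):
    return p00 if n == 1 else p01 * p11 ** (n - 2) * p10

total = sum(f(n) for n in range(1, 200))
print(total)          # partial sums approach 1: state 0 is persistent
```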

The contrasting term to "persistent state" is "transient state". My own example of a Markov chain with no persistent state at all is as follows:

Consider a chain whose (deterministic) transition map is $$ M : [0,1) \to [0,1), \qquad M(x) = \{ x + \sqrt{2}\}, $$ where the notation $\{w\}$ means the fractional part of $w$, that is, $w-\lfloor w \rfloor$. (The additive form is used here; with $M(x) = \{x\sqrt{2}\}$ the point $x = 0$ would be a fixed, hence persistent, state.)

The states of $M$ are the reals in the half-open interval $[0,1)$. No state will ever repeat exactly: $x + n\sqrt{2} \equiv x \pmod{1}$ would force $n\sqrt{2}$ to be an integer, which is impossible for $n \ge 1$. So there is no persistent ("steady") state.
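A numerical sketch of a rotation of this kind, using the additive map $M(x) = \{x + \sqrt 2\}$ (the additive form avoids the fixed point $x=0$ that multiplication by $\sqrt2$ would have). This is only an illustration: floats approximate the reals, so non-recurrence over all time is a mathematical fact, not something a finite simulation can prove.

```python
from math import sqrt, floor

# Irrational rotation of the circle [0, 1): add sqrt(2), keep the
# fractional part. No orbit point ever repeats exactly.
def M(x):
    y = x + sqrt(2)
    return y - floor(y)   # fractional part of x + sqrt(2)

x, seen = 0.25, set()     # 0.25 is an arbitrary starting state
for _ in range(1000):
    assert x not in seen  # no repeat within this finite sample
    seen.add(x)
    x = M(x)
```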