Limiting Distribution of a Markov Chain


I'm having trouble understanding how to find a limiting distribution. Suppose I have a Markov chain whose transition probability matrix (rows and columns indexed by the states $0$ through $4$) is: $$ \mathbf{P} = \begin{array}{c|ccccc} & 0 & 1 & 2 & 3 & 4 \\ \hline 0 & q & p & 0 & 0 & 0 \\ 1 & q & 0 & p & 0 & 0 \\ 2 & q & 0 & 0 & p & 0 \\ 3 & q & 0 & 0 & 0 & p \\ 4 & 1 & 0 & 0 & 0 & 0 \end{array} $$

where $p > 0$, $q > 0$, and $p + q = 1$.

How would I go about finding the limiting distribution? Thanks for any and all help!
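For concreteness, the matrix can be written out numerically. The sketch below, assuming the illustrative value $p = 0.3$ (any $0 < p < 1$ works), just builds $\mathbf{P}$ and checks that it is row-stochastic:

```python
import numpy as np

p = 0.3      # assumed illustrative value; any 0 < p < 1 works
q = 1 - p

# From each state 0..3: move up one state with probability p,
# fall back to state 0 with probability q; state 4 returns to 0 surely.
P = np.array([
    [q, p, 0, 0, 0],
    [q, 0, p, 0, 0],
    [q, 0, 0, p, 0],
    [q, 0, 0, 0, p],
    [1, 0, 0, 0, 0],
])

# Every row of a valid transition matrix must sum to 1.
print(P.sum(axis=1))  # -> [1. 1. 1. 1. 1.]
```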



HINT: Diagonalize the matrix $\mathbf{P}$.
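Following this hint numerically (a sketch, again assuming the illustrative value $p = 0.3$): a stationary distribution is a left eigenvector of $\mathbf{P}$ for eigenvalue $1$. Since `numpy.linalg.eig` returns right eigenvectors, we decompose $\mathbf{P}^\top$ instead.

```python
import numpy as np

p = 0.3                      # assumed illustrative value
q = 1 - p
P = np.array([[q, p, 0, 0, 0],
              [q, 0, p, 0, 0],
              [q, 0, 0, p, 0],
              [q, 0, 0, 0, p],
              [1, 0, 0, 0, 0]])

# A stationary distribution pi satisfies pi P = pi, i.e. it is a
# right eigenvector of P.T with eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(vals - 1))   # locate the eigenvalue closest to 1
pi = np.real(vecs[:, k])
pi /= pi.sum()                    # normalize to a probability vector

print(pi)  # entries proportional to (1, p, p**2, p**3, p**4)
```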


A limiting distribution does exist, and whether $\mathbf{P}$ is diagonalizable does not decide the question: the chain is finite, irreducible (state $0$ leads to every state, and every state leads back to $0$), and aperiodic (state $0$ has a self-loop of probability $q > 0$), so it has a unique stationary distribution, which is also its limiting distribution.

The state diagram (image omitted) is a cycle with fall-backs: from each state $i \in \{0, 1, 2, 3\}$ the chain moves to $i+1$ with probability $p$ and returns to $0$ with probability $q$; from state $4$ it returns to $0$ with probability $1$.

Write the stationary probabilities of states $0, 1, 2, 3, 4$ as $a, b, c, d, e$. Reading off the columns of $\mathbf{P}$, the balance equations $\pi = \pi \mathbf{P}$ are

$$a = q(a + b + c + d) + e$$

$$b = ap$$

$$c = bp$$

$$d = cp$$

$$e = dp$$

Substitution gives $c = ap^2$, $d = ap^3$, $e = ap^4$, and the first equation is then satisfied automatically, using $q = 1 - p$:

$$qa(1 + p + p^2 + p^3) + ap^4 = a(1 - p^4) + ap^4 = a$$

So the system is consistent, and the normalization $a + b + c + d + e = 1$ pins down the solution:

$$a(1 + p + p^2 + p^3 + p^4) = 1 \quad\Rightarrow\quad a = \frac{1 - p}{1 - p^5}$$

The limiting distribution is therefore

$$\pi_i = \frac{(1 - p)\,p^i}{1 - p^5}, \qquad i = 0, 1, 2, 3, 4$$
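As a numerical sanity check (a sketch assuming the illustrative value $p = 0.3$): the candidate vector $\pi_i = (1-p)p^i/(1-p^5)$ solves $\pi = \pi\mathbf{P}$, and repeatedly squaring $\mathbf{P}$ approximates $\lim_{n \to \infty} \mathbf{P}^n$, whose rows should all equal $\pi$.

```python
import numpy as np

p = 0.3                      # assumed illustrative value
q = 1 - p
P = np.array([[q, p, 0, 0, 0],
              [q, 0, p, 0, 0],
              [q, 0, 0, p, 0],
              [q, 0, 0, 0, p],
              [1, 0, 0, 0, 0]])

# Repeated squaring: after k squarings M equals P**(2**k).
M = P.copy()
for _ in range(20):
    M = M @ M

pi_limit = M[0]              # every row converges to the same vector
pi_formula = (1 - p) * p ** np.arange(5) / (1 - p ** 5)

print(np.allclose(pi_limit, pi_formula))  # -> True
```

Since the limit matrix has identical rows, the starting state is forgotten, which is exactly what a limiting distribution means.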