Show that this MC is ergodic?


Suppose I have a Markov chain with state space $S = \{1,2,3,4\}$ and probability transition matrix (PTM) given by

$P =$ $\begin{pmatrix} .25 & .25 & .25 & .25 \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0 \end{pmatrix}$

Is this chain ergodic? Why?

With $\mu = ( \mu_1, \mu_2, \mu_3, \mu_4)$, how can I find the invariant distribution by using $\mu = \mu P$ and $\mu_1 + \mu_2 + \mu_3 + \mu_4 = 1$?



We see that it is possible to reach state $1$ from states $2$, $3$, and $4$ via the path $$4 \rightarrow 3 \rightarrow 2 \rightarrow 1.$$ We can also transition from $1$ to $1$ with non-zero probability.

Similarly, there is non-zero probability of transitioning from $1$ to $2$, and we can follow the path

$$4 \rightarrow 3 \rightarrow 2$$

to get from $4$ or $3$ to $2$, so state $2$ is reachable from every other state. We can also return to $2$ via $2 \rightarrow 1 \rightarrow 4 \rightarrow 3 \rightarrow 2$.

There is non-zero probability of transitioning from $1$ to $3$; we can take the path $2 \rightarrow 1 \rightarrow 3$ to get from $2$ to $3$, and the path $4 \rightarrow 3$ to get from $4$ to $3$. We can also return to $3$ using the path $3 \rightarrow 2 \rightarrow 1 \rightarrow 3$.

Finally, we can take the path

$$3 \rightarrow 2 \rightarrow 1 \rightarrow 4$$

to get from every other state to $4$, and we can use the path $4 \rightarrow 3 \rightarrow 2 \rightarrow 1 \rightarrow 4$ to get from $4$ back to $4$.

Thus every state can be reached from every other state with non-zero probability, so the chain is irreducible. Moreover, since $p_{11} = .25 > 0$, state $1$ has a self-loop, so the chain is also aperiodic; hence this Markov chain is ergodic.
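As a quick sanity check (a minimal NumPy sketch; using matrix powers is my own verification approach, not part of the argument above), a finite chain is ergodic exactly when some power of $P$ has all strictly positive entries:

```python
import numpy as np

# Transition matrix from the question
P = np.array([
    [0.25, 0.25, 0.25, 0.25],
    [1.0,  0.0,  0.0,  0.0 ],
    [0.0,  1.0,  0.0,  0.0 ],
    [0.0,  0.0,  1.0,  0.0 ],
])

# A finite chain is ergodic (irreducible + aperiodic) iff P^n > 0
# entrywise for some n (a "regular" chain).  Here n = 4 already works.
P4 = np.linalg.matrix_power(P, 4)
print((P4 > 0).all())  # every 4-step transition probability is positive
```

Each entry of $P^4$ being positive means every state reaches every state in exactly four steps, which packages irreducibility and aperiodicity into a single check.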

We are looking for $\mu = (\mu_1,\mu_2,\mu_3,\mu_4)$ such that $$\mu = \mu \cdot P = (.25 \mu_1 + \mu_2,.25\mu_1 + \mu_3, .25\mu_1 + \mu_4,.25 \mu_1).$$

This is essentially solving a system of linear equations with the requirement that $\sum_{i=1}^4 \mu_i = 1.$ We see that $$\mu_4 = .25 \mu_1$$ and $$\mu_3 = .25\mu_1 + \mu_4 = .5 \mu_1$$ and $$\mu_2 = .25\mu_1 + \mu_3 = .25\mu_1 + .5\mu_1 = .75\mu_1$$ and, of course, $$\mu_1 = .25\mu_1 + \mu_2 = \mu_1.$$ We see that $$\mu_1 + \mu_2 + \mu_3 + \mu_4 = 1$$ $$\implies \mu_1 + .75\mu_1 + .5\mu_1 + .25\mu_1 = 2.5\mu_1 = 1$$ $$\implies \mu_1 = \frac{1}{2.5} = \frac{2}{5}.$$

The rest follow directly: $\mu_2 = .75\mu_1 = \frac{3}{10}$, $\mu_3 = .5\mu_1 = \frac{1}{5}$, and $\mu_4 = .25\mu_1 = \frac{1}{10}$, so $\mu = \left(\frac{2}{5}, \frac{3}{10}, \frac{1}{5}, \frac{1}{10}\right)$.
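This stationary distribution can be checked numerically (a sketch assuming NumPy; solving via a left eigenvector of $P$ is one of several ways to handle $\mu = \mu P$):

```python
import numpy as np

P = np.array([
    [0.25, 0.25, 0.25, 0.25],
    [1.0,  0.0,  0.0,  0.0 ],
    [0.0,  1.0,  0.0,  0.0 ],
    [0.0,  0.0,  1.0,  0.0 ],
])

# mu = mu P means mu is a left eigenvector of P with eigenvalue 1,
# i.e. a (right) eigenvector of P transpose.  Normalise to sum to 1.
vals, vecs = np.linalg.eig(P.T)
mu = np.real(vecs[:, np.argmax(np.isclose(vals, 1))])
mu = mu / mu.sum()

print(np.round(mu, 6))      # ~ [0.4, 0.3, 0.2, 0.1]
assert np.allclose(mu @ P, mu)  # mu is indeed invariant
```

This agrees with the hand computation $\mu = (2/5,\, 3/10,\, 1/5,\, 1/10)$.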


An irreducible Markov chain is ergodic if and only if it is aperiodic. Specifically, the period $k$ of a state $i$ in an irreducible chain is defined as $k=\gcd\{n: \Pr(X_n = i \mid X_0 =i ) >0 \}$, and every state of an irreducible chain has the same period. Since state $1$ can move to itself, $\Pr(X_1 = 1 \mid X_0 = 1) > 0$, so the $\gcd$ is $1$; the chain is aperiodic and thus ergodic. Moreover, if we start the chain from its stationary distribution, the resulting stationary process is ergodic in the sense of ergodic theory.