Periodicity is a class property of a Markov chain, but I do not understand how to calculate the period of a state from a transition probability matrix.
I am following the book "An Introduction to Stochastic Modeling" by Howard M. Taylor and Samuel Karlin.
The book gives some examples on the periodicity of a Markov chain, but I don't understand them. Here are two of the examples:
$(1)$ In a finite Markov chain of $n$ states with transition matrix $$P= \begin{bmatrix} 0 & 1 & 0&0&\ldots&0 \\ 0 &0 & 1&0&\ldots&0 \\ \vdots\\ 0 & 0 &&&\ldots&1 \\ 1 & 0 & 0&&\ldots&0 \\ \end{bmatrix}, $$
each state has period $n$. But how can I see that the period is $n$?
$(2)$ A Markov chain has the transition probability matrix
$$P= \begin{bmatrix} 0 & 0 & 0.8 &0.2\\ 0 & 0 & 0.4 &0.6\\ 0.7 & 0.3 & 0 &0\\ 0.2 & 0.8 & 0 &0\\ \end{bmatrix}. $$
All states are periodic with period $2$. But why is the period $2$ here?
Any help is appreciated. Thank you.
a) In the first case, drawing the corresponding graph gives a cycle, i.e. the transitions are of the form: $$1\to 2\to \cdots \to (n-1)\to n\to 1\to \cdots .$$ Starting from any state $i$, $i = 1,\ldots, n$, we return to that same state $i$ after exactly $k\cdot n$ steps ($k \in \mathbb Z^+$), and after no other number of steps. Since the period of a state is the greatest common divisor of the possible return times, the period of state $i$ is $d_i = \gcd\{kn : k \in \mathbb Z^+\} = n$ for $i = 1,\ldots,n$. (In fact, this is an irreducible Markov chain, so all the states necessarily have the same period.)
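As a numerical sanity check, the period of a state can be computed directly from its definition: the gcd of all step counts $k$ for which $(P^k)_{ii} > 0$. This is a minimal sketch in pure Python; the helpers `mat_mul` and `period` and the choice $n = 5$ are illustrative, not from the book, and checking powers only up to `max_steps` is adequate for small chains like these.

```python
from math import gcd

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, state, max_steps=50):
    """gcd of all step counts k <= max_steps with (P^k)[state][state] > 0."""
    n = len(P)
    d = 0
    Pk = [[float(i == j) for j in range(n)] for i in range(n)]  # identity matrix
    for k in range(1, max_steps + 1):
        Pk = mat_mul(Pk, P)          # Pk is now P^k
        if Pk[state][state] > 0:     # a return to `state` in k steps is possible
            d = gcd(d, k)
    return d

# The cyclic chain from (1) with n = 5: state i moves deterministically to i+1 (mod n),
# so the only ones in the matrix sit on the superdiagonal plus the bottom-left corner.
n = 5
P = [[float(j == (i + 1) % n) for j in range(n)] for i in range(n)]
print([period(P, i) for i in range(n)])  # prints [5, 5, 5, 5, 5]
```

Returns to any state happen only at steps $5, 10, 15, \ldots$, so the gcd is $5 = n$, matching the argument above.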
b) Here, too, the Markov chain is irreducible, so every state has the same period. Notice that the chain alternates between the two blocks of states: $$\{1,2\}\to \{3,4\}\to\{1,2\}\to\{3,4\}\to \cdots.$$ So a return to any state is possible only after an even number of steps, and a return in exactly $2$ steps occurs with positive probability. What does this tell you about the periodicity of each state?
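The same gcd-of-return-times check can be run on the matrix from (2). Again a minimal sketch with illustrative helper names (`mat_mul`, `period`); the entries of `P` are taken from the question.

```python
from math import gcd

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, state, max_steps=50):
    """gcd of all step counts k <= max_steps with (P^k)[state][state] > 0."""
    n = len(P)
    d = 0
    Pk = [[float(i == j) for j in range(n)] for i in range(n)]  # identity matrix
    for k in range(1, max_steps + 1):
        Pk = mat_mul(Pk, P)
        if Pk[state][state] > 0:
            d = gcd(d, k)
    return d

# Transition matrix from (2): states {1,2} only lead to {3,4} and vice versa.
P = [[0.0, 0.0, 0.8, 0.2],
     [0.0, 0.0, 0.4, 0.6],
     [0.7, 0.3, 0.0, 0.0],
     [0.2, 0.8, 0.0, 0.0]]
print([period(P, i) for i in range(4)])  # prints [2, 2, 2, 2]
```

For instance, $(P^2)_{11} = 0.8\cdot 0.7 + 0.2\cdot 0.2 = 0.6 > 0$, while every odd power has a zero diagonal, so the gcd of the return times is $2$.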