Expected time in Markov chain


Consider two states, State $1$ and State $2$. The probability of going from State $1$ to State $2$ in one day is $20\%$, and the probability of staying in State $1$ is $80\%$. The probability of going from State $2$ to State $1$ is $50\%$, and the probability of staying in State $2$ is $50\%$. In $252$ days, how many times do we expect to visit State $1$?

I drew a Markov chain, and I got the transition matrix:

$$T := \begin{pmatrix} 0.8 & 0.5 \\ 0.2 & 0.5\end{pmatrix} $$

Then I took large matrix powers and found

$$\lim_{k\to\infty} T^{k} = \begin{pmatrix}5/7 & 5/7 \\2/7 & 2/7 \end{pmatrix} $$

Does this mean that my answer is just $(5/7) \cdot 252 = 180$?
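As a numerical sanity check, here is a short sketch (assuming NumPy) that takes a large matrix power and computes the expected count, matching the limit above:

```python
import numpy as np

# Transition matrix as in the post: column j holds the probabilities
# of moving FROM state j+1, i.e. T[i, j] = P(next = i+1 | current = j+1)
T = np.array([[0.8, 0.5],
              [0.2, 0.5]])

# A large power of T: both columns converge to the stationary
# distribution (5/7, 2/7), regardless of the starting state
Tk = np.linalg.matrix_power(T, 100)

# Expected number of days (out of 252) spent in State 1
expected_visits = 252 * Tk[0, 0]
print(Tk)              # both columns approximately [5/7, 2/7]
print(expected_visits) # approximately 180
```

The same stationary distribution can be read off the eigenvector of $T$ for eigenvalue $1$, so the answer does not depend on taking matrix powers numerically.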