Required number of transitions to reach steady state of Markov chain


Consider a Markov chain with initial distribution $I$ and transition matrix $T$. The steady-state distribution $S$ solves the equation $ST=S$. But is it possible to compute the minimum number of transitions needed to reach the steady state? In other words, how can one compute the smallest $n$ such that $IT^n=S$?
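One practical caveat: for most chains $IT^n$ only converges to $S$ in the limit $n \to \infty$ and never equals it exactly, so numerically the question is usually rephrased as finding the smallest $n$ for which $IT^n$ is within some tolerance of its next iterate. A minimal sketch of that approach (the example matrix and the tolerance `tol` are illustrative assumptions, not from the question):

```python
import numpy as np

def steps_to_steady_state(I, T, tol=1e-10, max_steps=10_000):
    """Iterate x -> x @ T, counting transitions until the distribution
    changes by less than `tol`. Exact equality IT^n = S generally holds
    only in the limit, so a tolerance stands in for 'steady state' here."""
    x = np.asarray(I, dtype=float)
    for n in range(1, max_steps + 1):
        x_next = x @ T          # one transition of the chain
        if np.max(np.abs(x_next - x)) < tol:
            return n, x_next
        x = x_next
    raise RuntimeError("did not converge within max_steps")

# Illustrative two-state chain: rows of T sum to 1
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])
I = np.array([1.0, 0.0])        # start in state 0 with certainty

n, S = steps_to_steady_state(I, T)
```

Because convergence is geometric (governed by the second-largest eigenvalue modulus of $T$), the count $n$ returned this way depends on the chosen tolerance, not on an intrinsic finite stopping time of the chain.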