restarting a Markov chain


I'm reading an article and having difficulty understanding some basic stochastic processes (I'm new to the subject, so please pardon the wording of the question). Let $S$ be a set of states and let $Q$ be a subset of $S$. Consider a Markov chain on $S$, and modify it to obtain a Markov chain on $Q$ as follows: start the chain in $Q$, and whenever it exits $Q$, restart it uniformly in $Q$. Let $A$ be the transition matrix of the Markov chain on $S$, and let $A_Q$ be the restriction of $A$ to $Q$. The article concludes that the ergodic distribution of the modified Markov chain is (I assume up to normalization) \begin{equation} \boldsymbol{1}_Q\cdot (I_Q+A_Q+A_Q^2+\dots)\,\,\,\,\,\,\, (*) \end{equation} where $I_Q$ is the identity matrix on $Q$ and $\boldsymbol{1}_Q$ is the all-ones row vector; in other words, $(*)$ is the vector of column sums of $(I_Q-A_Q)^{-1}$.
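To make the question concrete, here is a small numeric experiment I put together (the matrix $A$ and the subset $Q$ are my own toy choices, not from the article): it builds the modified chain explicitly by spreading each state's exit probability uniformly over $Q$, computes its stationary distribution, and compares it with the normalized column sums of $(I_Q - A_Q)^{-1}$. The two appear to agree, which is what prompted question 3.

```python
import numpy as np

# Toy example (my own choice, not from the article): S = {0,1,2,3}, Q = {0,1}.
A = np.array([
    [0.50, 0.20, 0.20, 0.10],
    [0.10, 0.60, 0.10, 0.20],
    [0.30, 0.30, 0.20, 0.20],
    [0.25, 0.25, 0.25, 0.25],
])
Q = [0, 1]
A_Q = A[np.ix_(Q, Q)]  # restriction of A to Q

# Modified chain on Q: move within Q as before; the probability mass that
# would leave Q ("exit") is redistributed uniformly over Q (the restart).
exit_prob = 1.0 - A_Q.sum(axis=1)                 # prob. of exiting Q from each state
M = A_Q + np.outer(exit_prob, np.ones(len(Q)) / len(Q))

# Stationary distribution of M: left eigenvector for the eigenvalue 1.
vals, vecs = np.linalg.eig(M.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Formula (*): column sums of (I - A_Q)^{-1}, normalized to sum to 1.
colsums = np.linalg.inv(np.eye(len(Q)) - A_Q).sum(axis=0)
print(pi, colsums / colsums.sum())
```

Note that $(*)$ had to be normalized before comparing, since the column sums of $(I_Q - A_Q)^{-1}$ do not sum to $1$.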

My questions:

  1. What does it mean to restart a Markov chain uniformly?
  2. Is the ergodic distribution the same as the stationary distribution in this situation?
  3. How does one derive the formula $(*)$?

Thanks in advance!