Let $\lbrace X_n,\;n=0,1,2,\ldots\rbrace$ be a three-state Markov chain with state space $\mathcal{S}=\lbrace 0,1,2 \rbrace$ and transition probability matrix $\mathbf{P} = \begin{bmatrix} 0.4 & 0.2 & 0.4 \\ 0.3 & 0.7 & 0 \\ 0.8 & 0 & 0.2 \end{bmatrix}$
State 0 represents an operating state of the system, while states 1 and 2 represent repair states (corresponding to two types of failures). We assume that the process begins in state $X_0 = 0$; the successive returns to state 0 from a repair state then form a renewal process. Determine the mean duration of one of these renewal intervals.
I have tried following the answer in Mean duration of the renewal intervals, but I am having trouble applying it to a Markov chain with three or more states, like the one above. I would appreciate any help with this.
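For what it's worth, I can get a numerical value by computing the stationary distribution $\pi$ and taking $1/\pi_0$, which I believe equals the mean recurrence time of state 0 for a positive recurrent chain. A quick NumPy sketch (the setup of the linear system $\pi \mathbf{P} = \pi$, $\sum_i \pi_i = 1$ is my own, not from the linked answer):

```python
import numpy as np

# Transition matrix from the question.
P = np.array([[0.4, 0.2, 0.4],
              [0.3, 0.7, 0.0],
              [0.8, 0.0, 0.2]])

# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
# Stack (P^T - I) with a row of ones to get an overdetermined but
# consistent system, and solve it by least squares.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

mean_return_time = 1.0 / pi[0]
print(pi)               # approx [6/13, 4/13, 3/13]
print(mean_return_time) # 13/6 ≈ 2.1667
```

This gives $\pi_0 = 6/13$, so the mean renewal interval would be $13/6 \approx 2.17$ if my assumption is right, but I would still like to see how to set up the calculation directly (e.g. by first-step analysis) for a chain with three or more states.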