Suppose that $(X_n)_{n≥0}$ is Markov$(\lambda, P)$, but that we only observe the process when it moves to a new state. Define the observed process $(Z_m)_{m≥0}$ by $Z_m := X_{S_m}$, where $S_0 = 0$ and, for $m \geq 0$, $$S_{m+1}= \inf\{n \geq S_m : X_n \neq X_{S_m}\}.$$
Assuming that there are no absorbing states, and using the strong Markov property, I want to show that $(Z_m)_{m≥0}$ is a Markov chain and that its transition probabilities, for $i\neq j$, are given by
$$\overline p_{ij} = \frac{p_{ij}}{\sum_{k\neq i}p_{ik}}.$$
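For concreteness, here is what the claimed formula produces on a small example kernel (the matrix below is made up for illustration; note that $\sum_{k\neq i} p_{ik} = 1 - p_{ii}$, since each row of $P$ sums to $1$):

```python
import numpy as np

# A made-up 3-state transition matrix (no absorbing states, rows sum to 1).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Claimed jump-chain kernel: divide each off-diagonal entry by
# sum_{k != i} P[i, k] = 1 - P[i, i]; the diagonal becomes 0.
P_bar = P / (1.0 - np.diag(P))[:, None]
np.fill_diagonal(P_bar, 0.0)

print(P_bar)              # e.g. row 0 is [0, 0.6, 0.4]
print(P_bar.sum(axis=1))  # each row sums to 1, as a stochastic matrix should
```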
Here's how I try to solve this:
Firstly, the $S_m$ are stopping times for every $m\geq 0$.
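One way to justify this (a sketch by induction on $m$, using only the definition of $S_m$): $S_0 = 0$ is trivially a stopping time, and
$$\{S_{m+1} = n\} = \bigcup_{k=0}^{n-1} \{S_m = k\} \cap \{X_k = X_{k+1} = \cdots = X_{n-1},\ X_n \neq X_k\},$$
which lies in $\sigma(X_0,\dots,X_n)$ whenever each $\{S_m = k\}$ lies in $\sigma(X_0,\dots,X_k)$.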
Next,
\begin{align} &\hphantom{{}={}}\Bbb P(Z_{m+1}=i_{m+1} \mid Z_0=i_0,\dots,Z_m=i_m) \\ &=\Bbb P(X_{S_{m+1}}=i_{m+1} \mid X_{S_0}=i_0,\dots, X_{S_m}=i_m) \\ &=\Bbb P_{i_m}(X_{S_1}=i_{m+1}) =\overline p_{i_mi_{m+1}} \qquad \text{(by the strong Markov property)} \end{align}
where $\overline p_{ij}=\Bbb P_i(\text{the first new state visited after leaving } i \text{ is } j)$. Conditioning on the first step of the $X$ chain: with probability $p_{ij}$ it jumps directly to $j$; with probability $p_{ii}$ it stays at $i$ and the situation restarts; and if it jumps to any other state $k \neq i,j$, then the first new state is $k$, not $j$. This gives the one-step equation $$\overline p_{ij}= p_{ij}+ p_{ii}\,\overline p_{ij}$$
So far I have proved that it is a Markov chain. Where I get stuck is deriving the closed form of the transition probabilities from the one-step equation $$\overline p_{ij}= p_{ij} + p_{ii}\,\overline p_{ij}.$$ How is this supposed to lead to the final form (which I'm struggling to deduce)? $$\overline p_{ij} = \frac{p_{ij}}{\sum_{k\neq i}p_{ik}}$$
Any ideas? Any help is appreciated.
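For what it's worth, a quick Monte Carlo check (a minimal sketch; the 3-state matrix below is made up for illustration) agrees with the claimed formula:

```python
import random

random.seed(0)

# A made-up 3-state transition matrix with no absorbing states.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

def step(i):
    """One step of the X chain from state i."""
    return random.choices(range(3), weights=P[i])[0]

# Run the X chain and record the transitions of the observed
# jump process Z (state changes only).
counts = {}
x = 0
z = 0  # current state of Z; X_0 = 0
for _ in range(200_000):
    x = step(x)
    if x != z:
        counts[(z, x)] = counts.get((z, x), 0) + 1
        z = x

# Compare empirical jump frequencies with p_ij / sum_{k != i} p_ik.
for i in range(3):
    total = sum(counts.get((i, j), 0) for j in range(3))
    for j in range(3):
        if i == j:
            continue
        empirical = counts.get((i, j), 0) / total
        claimed = P[i][j] / sum(P[i][k] for k in range(3) if k != i)
        print(f"{i}->{j}: empirical {empirical:.3f}, claimed {claimed:.3f}")
```

The empirical frequencies land within Monte Carlo error of $p_{ij}/\sum_{k\neq i}p_{ik}$ for every pair $i\neq j$.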