Why is the following a Markov Chain?


I have a question regarding a Markov process. I have tried a few things, but they did not lead anywhere. The question is:

Let $(X_n)_{n\geq0}$ be a Markov process with transition matrix $P$. Assume that the process does not have any absorbing states. Define a sequence $(S_m)_{m\geq0}$ of $\mathbb{N} \cup \{0\}$-valued random variables by $S_0 = 0$ and

$S_{m+1} = \inf \{n \geq S_m : X_n \neq X_{S_m}\}$.

Question: Show that the process $(Z_m)_{m\geq0}$ defined by $Z_m=X_{S_m}$ is a Markov chain and compute its transition probabilities (in terms of the entries of $P$).

I hope someone can help me or give some tips. Thanks!


So, we have $P$, the transition matrix of the Markov chain $(X_n)$. Let us denote by $P^S$ the transition matrix of the chain $Z_m = X_{S_m}$. This answer will not touch upon proving that $(Z_m)$ is Markov (in my opinion, that part is fairly routine).

As noted in the comments, the only difference between $X_n$ and $Z_m$ is that the latter never makes a self-transition: each jump time $S_m$ skips over the steps on which $X$ stays put. (To see why, compute $Z_m$ by hand for a short sample path.) Hence $P^S_{ii} = 0$.
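To see the construction concretely, here is a small simulation sketch (the three-state matrix $P$ is a made-up example, not from the question): it samples a path of $X_n$ and strips consecutive repeats, which is exactly evaluating $X$ at the jump times $S_m$.

```python
import random

# Made-up 3-state transition matrix with self-loops (no absorbing states).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

def sample_path(P, x0, n_steps, rng):
    """Simulate n_steps transitions of the chain X_n started at x0."""
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choices(range(len(P)), weights=P[path[-1]])[0])
    return path

def jump_chain(path):
    """Evaluate the path at the jump times S_m: drop consecutive repeats."""
    Z = [path[0]]
    for x in path[1:]:
        if x != Z[-1]:
            Z.append(x)
    return Z

rng = random.Random(0)
X = sample_path(P, 0, 30, rng)
Z = jump_chain(X)
# The jump chain never makes a self-transition.
assert all(Z[m] != Z[m + 1] for m in range(len(Z) - 1))
```

Running `jump_chain` on a path such as `[0, 0, 1, 1, 1, 2, 2, 0]` yields `[0, 1, 2, 0]`: the self-loops disappear, the order of distinct visits is preserved.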

To compute $P^S_{ij}$ for $i\neq j$, there are two methods:

  1. By a proportionality argument: conditional on leaving state $i$, the chance of landing in $j$ is proportional to $P_{ij}$, so $P^S_{ij} = c\,P_{ij}$ for some constant $c > 0$ (depending on $i$). Since the entries of each row need to add up to $1$, we get $c = 1/(1-P_{ii})$, hence $P^S_{ij} = P_{ij}/(1-P_{ii})$.
  2. Directly from the definition, conditioning on the number of self-transitions before the first jump: $P^S_{ij} = P_{ij} + P_{ii}P_{ij} + P_{ii}^2 P_{ij} + \cdots = P_{ij}/(1-P_{ii})$, a geometric series.
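Both computations can be checked numerically. The sketch below (using a made-up three-state matrix $P$, not from the question) builds $P^S$ from the formula $P^S_{ij} = P_{ij}/(1-P_{ii})$, verifies that its rows sum to $1$, and compares one entry against a Monte Carlo estimate of where the chain first jumps from state $0$.

```python
import random

# Made-up 3-state transition matrix with self-loops (no absorbing states).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]
n = len(P)

# Jump-chain matrix: zero diagonal, off-diagonal entries renormalised.
PS = [[0.0 if i == j else P[i][j] / (1 - P[i][i]) for j in range(n)]
      for i in range(n)]

# Each row of PS still sums to 1.
for row in PS:
    assert abs(sum(row) - 1.0) < 1e-12

# Monte Carlo check of PS[0][1]: run the original chain from state 0
# until it leaves 0, and record how often it lands in state 1.
rng = random.Random(1)
trials = 100_000
hits = 0
for _ in range(trials):
    x = 0
    while x == 0:  # the self-transitions of X are exactly what Z skips
        x = rng.choices(range(n), weights=P[0])[0]
    hits += (x == 1)
estimate = hits / trials
assert abs(estimate - PS[0][1]) < 0.01  # PS[0][1] = 0.3 / (1 - 0.5) = 0.6
```

The Monte Carlo loop mirrors method 2: each iteration of the `while` loop that stays in state $0$ contributes one factor of $P_{00}$ to the geometric series.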