Is $(n,x_n)$, with $(x_n)$ a Markov chain, always a time-homogeneous Markov chain?


Someone told me that the discrete-time stochastic process $(n,x_n)$ is always a time-homogeneous Markov chain (where $(x_n)$ is a discrete-time Markov chain).

What do you think of this claim? It seems false to me, because I don't see how adjoining $n$ to the process makes it time-homogeneous.

More generally, do you know how to transform any Markov chain into a time-homogeneous one?


Best Answer

Every state of the new chain carries the time as part of the state itself: the state is the pair $(n,x)$. The transition probabilities therefore depend only on the current state, not on the step count separately:
$$P\big((n,x),(n+1,y)\big) = p_n(x,y),$$
where $p_n$ is the (possibly time-dependent) kernel of the original chain. This kernel on pairs is the same function at every step, which is exactly time-homogeneity.

That's what makes it time-homogeneous, but in a rather brutish way. For instance, it turns a finite-state Markov chain into one with infinitely many states, and every state of the new chain is transient, because it can be visited at most once: the time component strictly increases along every trajectory.
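A minimal sketch of the construction in Python, using a made-up time-inhomogeneous kernel `p` on $\{0,1\}$ for illustration (the specific probabilities are my own choice, not from the question). The space-time kernel `q` depends only on the pair-state, so it is a single, time-independent transition function:

```python
# Hypothetical time-inhomogeneous chain on {0, 1}: at even times it tends
# to stay put, at odd times it tends to flip (made-up kernel for illustration).
def p(n, x, y):
    """Transition probability P(x_{n+1} = y | x_n = x) at time n."""
    stay = 0.9 if n % 2 == 0 else 0.1
    return stay if y == x else 1.0 - stay

# Space-time kernel on pair-states (n, x): it depends only on the current
# pair, never on an external clock, so ((n, x_n)) is time-homogeneous.
def q(state, new_state):
    (n, x), (m, y) = state, new_state
    if m != n + 1:          # the time component always advances by exactly 1
        return 0.0
    return p(n, x, y)

# Sanity check: from every pair-state, q sums to 1 over reachable states,
# and it reproduces the original time-dependent probabilities.
for n in range(4):
    for x in (0, 1):
        total = sum(q((n, x), (n + 1, y)) for y in (0, 1))
        assert abs(total - 1.0) < 1e-12
```

Note that every pair-state $(n,x)$ can only lead to states of the form $(n+1,y)$, which is what makes each state transient in the enlarged chain.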