Is the inverse of a Markov chain always a Markov chain?


I'm aware that a Markov chain started in a stationary distribution has a time-reversal which is again a Markov chain. But I suspect this to be false in general. Can someone please provide a counterexample?

Best answer:

Let $X_t$ be a Markov chain on a discrete state space. For any states $x, y, z$ we have

$$\begin{aligned}
\mathbb{P}[X_t = x \mid X_{t+1} = y, X_{t+2} = z] &= \frac{\mathbb{P}[X_t = x, X_{t+1} = y, X_{t+2} = z]}{\mathbb{P}[X_{t+1} = y, X_{t+2} = z]}\\
&= \frac{\mathbb{P}[X_{t+2} = z \mid X_{t+1} = y, X_t = x]\,\mathbb{P}[X_{t+1} = y, X_t = x]}{\mathbb{P}[X_{t+1} = y, X_{t+2} = z]}\\
&= \frac{\mathbb{P}[X_{t+2} = z \mid X_{t+1} = y]\,\mathbb{P}[X_{t+1} = y, X_t = x]}{\mathbb{P}[X_{t+2} = z \mid X_{t+1} = y]\,\mathbb{P}[X_{t+1} = y]}\\
&= \frac{\mathbb{P}[X_{t+1} = y, X_t = x]}{\mathbb{P}[X_{t+1} = y]} = \mathbb{P}[X_t = x \mid X_{t+1} = y],
\end{aligned}$$

where the third equality uses the Markov property of $X_t$ in the numerator and the factorization $\mathbb{P}[X_{t+1} = y, X_{t+2} = z] = \mathbb{P}[X_{t+2} = z \mid X_{t+1} = y]\,\mathbb{P}[X_{t+1} = y]$ in the denominator.

Like this you get the Markov property for the reversed process (conditioning on further future values works the same way).
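As a quick numerical sanity check of the identity above, one can enumerate the joint distribution of $(X_0, X_1, X_2)$ for a small chain and compare the two conditional probabilities directly. The transition matrix `P` and initial distribution `mu0` below are made-up illustration values, not from the original post:

```python
import numpy as np

# Hypothetical two-state chain: rows of P are the current state,
# mu0 is an arbitrary (non-stationary) initial distribution.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
mu0 = np.array([0.5, 0.5])

# Joint distribution of (X_0, X_1, X_2): joint[x, y, z] = mu0[x] P[x, y] P[y, z].
joint = mu0[:, None, None] * P[:, :, None] * P[None, :, :]
assert abs(joint.sum() - 1.0) < 1e-12

for y in range(2):
    for z in range(2):
        # P[X_0 = x | X_1 = y, X_2 = z]
        cond_yz = joint[:, y, z] / joint[:, y, z].sum()
        # P[X_0 = x | X_1 = y]
        cond_y = joint[:, y, :].sum(axis=1) / joint[:, y, :].sum()
        # The extra future value z is irrelevant, as the derivation shows.
        assert np.allclose(cond_yz, cond_y)
print("reversed Markov property verified")
```

Note that nothing here requires `mu0` to be stationary: the reversed Markov property holds for any initial distribution.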

But even if $X_t$ is time-homogeneous, that is $\mathbb{P}[X_{t+1} = y \mid X_t = x] = p(y \mid x)$ for all $t$, Bayes' rule still gives $$\mathbb{P}[X_t = x \mid X_{t+1} = y] = \frac{p(y \mid x)\, \mathbb{P}[X_t = x]}{\mathbb{P}[X_{t+1} = y]},$$ which depends on $t$ through the marginals $\mathbb{P}[X_t = x]$ and $\mathbb{P}[X_{t+1} = y]$ unless the chain is started in a stationary distribution. So the reversed process is always Markov, but in general not time-homogeneous.
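To make the time-dependence concrete, here is a sketch (with the same hypothetical two-state chain as above; `P` and `mu0` are illustration values) that computes the reversed kernel at $t = 0$ and $t = 1$ and shows they differ when the chain is started away from stationarity:

```python
import numpy as np

# Hypothetical two-state chain, started away from its stationary distribution.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
mu0 = np.array([0.5, 0.5])
mu1 = mu0 @ P          # marginal of X_1
mu2 = mu1 @ P          # marginal of X_2

def reversed_kernel(mu_t, mu_next):
    # R[y, x] = P[X_t = x | X_{t+1} = y] = p(y | x) mu_t[x] / mu_next[y]
    return (P * mu_t[:, None]).T / mu_next[:, None]

R0 = reversed_kernel(mu0, mu1)   # reversal step from t = 1 back to t = 0
R1 = reversed_kernel(mu1, mu2)   # reversal step from t = 2 back to t = 1
print(R0)
print(R1)
# The reversed chain is Markov, but its kernel changes with t:
assert not np.allclose(R0, R1)

# By contrast, started in the stationary distribution pi (here pi P = pi
# holds for pi = (0.8, 0.2)), every marginal equals pi and the reversed
# kernel is the same at every t.
pi = np.array([0.8, 0.2])
assert np.allclose(pi @ P, pi)
assert np.allclose(reversed_kernel(pi, pi @ P),
                   reversed_kernel(pi @ P, pi @ P @ P))
```

This is exactly the counterexample the question asks for: any time-homogeneous chain started from a non-stationary distribution has a time-inhomogeneous reversal.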