How to prove a Markov chain is irreducible in general?


Consider the following example:

Suppose the random variable $X_n$ is $\text{Markov}(\pi,P)$, where $P$ is irreducible and has a stationary distribution $\pi$.

Then I can show that the time reversal of $X_n$, $Y_n=X_{N-n}$, is also a Markov chain, with transition probabilities $$P(Y_{n+1}=j\mid Y_{n}=i)=\hat{p}_{ij}=\frac{\pi_jp_{ji}}{\pi_i}.$$
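As a quick sanity check of this reversal formula (not part of the question itself), one can build $\hat{P}$ from a concrete $P$ and $\pi$ and confirm it is a valid stochastic matrix; the 3-state matrix below is an arbitrary example:

```python
import numpy as np

# Arbitrary irreducible 3-state transition matrix (example only)
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# Stationary distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Time-reversed transition matrix: P_hat[i, j] = pi[j] * P[j, i] / pi[i]
P_hat = (pi[None, :] * P.T) / pi[:, None]

# Each row sums to 1 precisely because pi is stationary:
# sum_j pi_j p_{ji} / pi_i = (pi P)_i / pi_i = 1
print(P_hat.sum(axis=1))
```

The row sums being 1 is exactly the stationarity condition $\sum_j \pi_j p_{ji} = \pi_i$ in disguise.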

However, is the Markov chain for $Y_n$ also irreducible?

My reasoning:

By definition, to prove irreducibility I need to show that

$$P(Y_n=j \mid Y_0=i)=\hat{p}^{(n)}_{ij}>0$$ for some $n>0$, for every pair of states $i$ and $j$.
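For a finite chain, this condition is equivalent to the directed graph on the states (with an edge $i\to j$ whenever $p_{ij}>0$) being strongly connected, which can be checked mechanically; a minimal sketch:

```python
import numpy as np

def is_irreducible(P, tol=1e-12):
    """Finite-chain irreducibility check: the chain is irreducible iff
    sum_{n=1..N} P^n has all entries positive (every state reaches
    every state within N steps, N = number of states)."""
    N = P.shape[0]
    reach = np.zeros_like(P)
    Pn = np.eye(N)
    for _ in range(N):
        Pn = Pn @ P
        reach += Pn
    return bool((reach > tol).all())

# Deterministic 2-state flip chain: periodic, but still irreducible
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(is_irreducible(P))  # True
```

Note this only tests reachability, i.e. irreducibility; it says nothing about aperiodicity.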

The only thing I can think of now is

\begin{align} P(Y_n=j \mid Y_0=i)&=\sum_{k_1,\dots,k_{n-1}}\hat{p}_{ik_1}\hat{p}_{k_1k_2}\hat{p}_{k_2k_3}\cdots\hat{p}_{k_{n-1}j}\\&=\sum_{k_1,\dots,k_{n-1}}\frac{\pi_{k_1}p_{k_1i}}{\pi_i}\frac{\pi_{k_2}p_{k_2k_1}}{\pi_{k_1}}\frac{\pi_{k_3}p_{k_3k_2}}{\pi_{k_2}}\cdots\frac{\pi_{j}p_{jk_{n-1}}}{\pi_{k_{n-1}}}\\ &=\frac{\pi_{j}p^{(n)}_{ji}}{\pi_{i}},\end{align}

where the $\pi_{k_m}$ telescope inside each product, and the remaining sum over paths is exactly $p^{(n)}_{ji}$.
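This closed form for the $n$-step reversed probabilities, $\hat{p}^{(n)}_{ij}=\pi_j p^{(n)}_{ji}/\pi_i$, can be checked numerically on a small example (the 3-state matrix below is arbitrary):

```python
import numpy as np
from numpy.linalg import matrix_power

# Arbitrary irreducible 3-state example
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# Stationary distribution via the left eigenvector for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Reversed chain: P_hat = D^{-1} P^T D with D = diag(pi)
P_hat = (pi[None, :] * P.T) / pi[:, None]

# Verify p_hat^(n)_{ij} == pi_j * p^(n)_{ji} / pi_i for several n
for n in range(1, 6):
    lhs = matrix_power(P_hat, n)
    rhs = (pi[None, :] * matrix_power(P, n).T) / pi[:, None]
    assert np.allclose(lhs, rhs)
print("identity holds for n = 1..5")
```

The identity is immediate in matrix form: with $\hat{P}=D^{-1}P^{\mathsf T}D$, we get $\hat{P}^n=D^{-1}(P^n)^{\mathsf T}D$, i.e. the reversal of the $n$-step chain.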

Since the Markov chain $X_n$ is irreducible, we know that $p^{(n)}_{ji}>0$ for some $n$. However, I cannot argue that $\pi_i$ and $\pi_j$ are always greater than $0$, so I cannot be completely sure that $P(Y_n=j \mid Y_0=i)=\hat{p}^{(n)}_{ij}>0$: it seems possible that $\pi_i=0$ or $\pi_j=0$ for some $i$ and $j$.

So, how do we show that a Markov chain is irreducible in general? What am I missing here?