For a finite state Markov chain, a state is recurrent iff it is essential


Let $P$ be the transition matrix of a Markov chain on a finite state space $\mathscr{X}$, and write $x \to y$ if $y$ is accessible from $x$, i.e. $P^n(x, y) > 0$ for some $n \ge 0$. A state $x \in \mathscr{X}$ is called essential if for all $y$ such that $x \to y$ it is also true that $y \to x$. Let $\tau_x^+ = \min\{ n \ge 1 : X_n = x \}$ denote the first return time to $x$. For finite chains, a state $x$ is essential if and only if $$P_x\{ \tau_x^+ < \infty \} = 1,$$ that is, $x$ is recurrent.
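To make the definition concrete, here is a small sketch (with a hypothetical 3-state transition matrix of my own choosing) that computes the essential states directly from the definition: build the accessibility relation $x \to y$ as the transitive closure of the positive entries of $P$, then keep the states $x$ for which every accessible $y$ leads back to $x$.

```python
import numpy as np

# Hypothetical 3-state chain: states 0 and 1 form a closed class,
# while state 2 leaks into it and never returns (so 2 is inessential).
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.7, 0.0],
              [0.2, 0.3, 0.5]])

def reachable(P):
    """Boolean matrix R with R[x, y] True iff x -> y, i.e. there is a
    path of positive probability from x to y (length 0 allowed)."""
    n = len(P)
    R = (P > 0) | np.eye(n, dtype=bool)   # one-step arrows plus x -> x
    for k in range(n):                    # Warshall transitive closure
        R = R | (R[:, [k]] & R[[k], :])
    return R

R = reachable(P)
# x is essential iff every y accessible from x can reach x back.
essential = [x for x in range(len(P))
             if all(R[y, x] for y in np.nonzero(R[x])[0])]
print(essential)  # -> [0, 1]
```

For this matrix the closed class $\{0, 1\}$ comes out essential and state 2 does not, matching the intuition that an inessential state has an escape route with no way back.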

I can see that if a state is essential then it must be recurrent. But how can we show that if $x$ is recurrent then $x$ is essential?
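The claimed equivalence can also be checked empirically. The sketch below (reusing the same hypothetical matrix as an assumption, not anything from the question) estimates $P_x\{\tau_x^+ \le T\}$ by Monte Carlo for a moderate horizon $T$: for the essential states the estimate should sit near $1$, while for the inessential state 2 it stays bounded away from $1$, since with probability $1/2$ the first step leaves $\{2\}$ for good.

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.7, 0.0],
              [0.2, 0.3, 0.5]])

def return_prob(P, x, trials=5_000, horizon=100, seed=0):
    """Monte Carlo estimate of P_x{tau_x^+ <= horizon}: the chance the
    chain started at x returns to x within `horizon` steps."""
    rng = np.random.default_rng(seed)
    n = len(P)
    hits = 0
    for _ in range(trials):
        s = x
        for _ in range(horizon):
            s = rng.choice(n, p=P[s])
            if s == x:
                hits += 1
                break
    return hits / trials

print(return_prob(P, 0))  # near 1: state 0 is essential, hence recurrent
print(return_prob(P, 2))  # near 0.5: state 2 is inessential, hence transient
```

For state 2 the exact return probability is $1/2$ (the chain returns iff the first step stays at 2), so the gap between the two estimates illustrates exactly the recurrent-iff-essential dichotomy being asked about.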