An irreducible Markov chain on a countable state space $S$ is recurrent if one of the following equivalent statements holds:
- There exists a finite set $A \subset S$ such that $P_x[T_A<\infty]=1$ for every $x \in A$;
- $P_x[T_x<\infty]=1$ for every $x$ in $S$.
Here $T_A=\min\{n\geq 1:X_n \in A\}$ for any $A \subset S$, with the convention $\min\emptyset=\infty$. An irreducible Markov chain is called transient if it is not recurrent.
How can we show the following result?
Suppose that the Markov chain $X$ is irreducible and transient. Then there exists a finite set $A \subset S$ and a point $x\in S$ such that $P_x[\tau_A<\infty]<1$, where $\tau_A=\min\{n\geq 0:X_n \in A\}$.
Thanks for any comments!
Okay, I see your worry now. It's not actually that bad, though.

So assume that your Markov chain is transient. Negating the first condition above, this means: for every finite set $A\subseteq S$ there exists some $x\in S$ such that $P_x[T_A<\infty]<1$. All we have to argue is that such an $x$ can in particular be chosen in $S\setminus A$, since for $x\notin A$ we have $P_x[\tau_A<\infty]=P_x[T_A<\infty]$ (the two hitting times differ only when the chain starts inside $A$, in which case $\tau_A=0$).
Indeed, assume for contradiction that $P_x[T_A<\infty]=1$ for all $x\not\in A$, and let $x_0\in A$. We will show that then $P_{x_0}[T_A<\infty]=1$ as well, which would mean that the first condition holds and the Markov chain is, in fact, recurrent. Let $(X_n)_{n\in \mathbb{N}_0}$ denote the Markov chain starting at $X_0=x_0$.
Now, if $X_1\in A$, then clearly $T_A<\infty$, so only first steps landing outside $A$ contribute. Conditioning on the first step and using the Markov property, $$ P_{x_0}(T_A=\infty)=\sum_{x\in S} 1_{x\not \in A}\, P_x(T_A=\infty)\, P_{x_0}(X_1=x)=0, $$ since $P_x(T_A=\infty)=0$ for every $x\notin A$ by assumption. We conclude that $P_{x_0}(T_A<\infty)=1$, the desired contradiction.
Hence, for a given finite $A\subseteq S$, if there exists some $x\in S$ such that $P_x[T_A<\infty]<1$, there must exist some $x'\in S\setminus A$ such that $P_{x'}[T_A<\infty]<1$, and therefore $$ P_{x'}[\tau_A<\infty]=P_{x'}[T_A<\infty]<1. $$
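Not part of the proof, but here is a concrete sanity check of the conclusion. Take the biased walk on $\mathbb{Z}$ with $p(k,k+1)=0.7$ and $p(k,k-1)=0.3$ (a standard example of an irreducible transient chain), $A=\{0\}$, and $x'=1$. The exact hitting probability is $P_1[\tau_A<\infty]=0.3/0.7=3/7\approx 0.43<1$, and a quick Monte Carlo estimate (a lower bound, since runs are truncated) lands near that value. The function name and parameters below are just for this illustration:

```python
import random

def estimate_hit_prob(start=1, p_up=0.7, max_steps=200, trials=20_000, seed=0):
    """Monte Carlo estimate of P_start[tau_{0} <= max_steps] for the biased
    walk on Z; a lower bound on P_start[tau_{0} < infinity]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = start
        for _ in range(max_steps):
            x += 1 if rng.random() < p_up else -1
            if x == 0:
                hits += 1
                break
        # runs that never reached 0 within max_steps count as misses
    return hits / trials

est = estimate_hit_prob()
print(est)  # close to 3/7, visibly bounded away from 1
```

The truncation at `max_steps` only loses mass from trajectories that hit $0$ very late, which is negligible here because a walk drifting to $+\infty$ either returns quickly or not at all.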