Transience implies convergence to infinity


Let $\mathbb{S}$ be countable. Let $X_n$ be the coordinate map on $\mathbb{S}^\mathbb{N}$, i.e., $X_n (\omega) = \omega_n$ for every $\omega \in \mathbb{S}^\mathbb{N}$, and let $p(x,y)$ be a transition probability on $\mathbb{S}$ such that $(X_n)$ is an irreducible, transient, time-homogeneous Markov chain.

Let $F$ be a finite subset containing the element $0\in \mathbb{S}$ and let $T_F = \sup \{n \ge 0: X_n\in F\}$ be the last time the path $X_n$ visits $F$. Why does $P^0(T_F <\infty) = 1$ hold? That is, if the Markov chain starts at $0$, why must it leave $F$ for good after finitely many steps?
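As a sanity check (an illustration only, not a proof), here is a quick simulation of a concrete transient chain of my own choosing: a nearest-neighbour walk on $\mathbb{Z}$ with $p(x,x+1)=2/3$, which is transient by the usual drift argument. For each run we record the last time $n$ within a long horizon at which the walk sits in $F=\{0\}$; transience predicts this last visit occurs early.

```python
import random

def last_visit(p_right=2/3, F=frozenset({0}), n_steps=10_000, seed=0):
    """Simulate a right-biased nearest-neighbour walk on Z started at 0
    and return the last time n <= n_steps with X_n in F."""
    rng = random.Random(seed)
    x = 0
    last = 0 if x in F else -1  # time 0 counts as a visit when 0 is in F
    for n in range(1, n_steps + 1):
        x += 1 if rng.random() < p_right else -1
        if x in F:
            last = n
    return last

# With p_right = 2/3 the walk has drift +1/3, hence is transient, so the
# last visit to F should occur long before the horizon n_steps.
times = [last_visit(seed=s) for s in range(20)]
print(times)
```

The same experiment with any finite `F` (e.g. `frozenset({-1, 0, 1})`) behaves the same way, which is exactly the content of the question.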

1 Answer

Note: in the case $F = \{0\}$, this is the definition of transience (or is equivalent to it). The essential question is why the same conclusion holds when $F$ is an arbitrary finite set containing $0$.

Sketch of proof of the desired result: suppose not. If $T_F = \infty$ with positive probability, then with positive probability the chain visits $F$ infinitely often. By irreducibility, each state $x \in F$ reaches $0$ with positive probability, and since $F$ is finite, this probability is uniformly bounded away from $0$. So on the event of infinitely many visits to $F$, each visit gives a uniformly positive chance of a subsequent visit to $0$. That forces the chain to visit $0$ infinitely often with positive probability, contradicting transience.
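The sketch can be made precise via the strong Markov property and the conditional Borel–Cantelli lemma; here is one possible write-up (the notation $\sigma_k$, $A_k$, $\varepsilon$ is mine):

```latex
By irreducibility, $P^x(X_n = 0 \text{ for some } n \ge 1) > 0$ for every
$x \in \mathbb{S}$ (for $x = 0$ this is the return probability, which is
positive because every state the chain steps to can reach $0$). Since $F$
is finite,
\[
  \varepsilon := \min_{x \in F} P^x\bigl(X_n = 0 \text{ for some } n \ge 1\bigr) > 0 .
\]
Suppose $P^0(T_F = \infty) > 0$. On $\{T_F = \infty\}$ the successive visit
times to $F$,
\[
  \sigma_1 < \sigma_2 < \cdots, \qquad
  \sigma_{k+1} = \inf\{n > \sigma_k : X_n \in F\},
\]
are all finite and $\sigma_k \to \infty$. Let
$A_k = \{X_n = 0 \text{ for some } n > \sigma_k\}$. By the strong Markov
property at $\sigma_k$,
\[
  P^0\!\left(A_k \mid \mathcal{F}_{\sigma_k}\right) \ge \varepsilon
  \quad \text{on } \{\sigma_k < \infty\},
\]
so $\sum_k P^0(A_k \mid \mathcal{F}_{\sigma_k}) = \infty$ on
$\{T_F = \infty\}$. By the conditional Borel--Cantelli lemma (L\'evy's
extension), almost surely on $\{T_F = \infty\}$ infinitely many of the
$A_k$ occur; since $\sigma_k \to \infty$, the chain then visits $0$ at
arbitrarily large times, i.e.\ infinitely often. Hence
\[
  P^0(\text{infinitely many visits to } 0) \;\ge\; P^0(T_F = \infty) \;>\; 0,
\]
contradicting transience.
```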