One-sided transient Markov Chain


Consider the state space $S = \{0, 1, 2, \dots\}$ and a Markov chain with transition probabilities $p(x, x+1) = 1-\alpha_x$ and $p(x, 0) = \alpha_x$. Assume $0 < \alpha_x < 1$, so the chain is irreducible. I have concluded that the chain is transient if and only if $\prod_x (1-\alpha_x) > 0$. I now need to show that transience implies $P^x(\tau_y < \infty) = 1$ for all $x < y$, where $\tau_y = \inf\{n > 0 : X_n = y\}$ is the first hitting time of $y$.

In other words, if the chain is transient, it is forced to exit to the right. Intuitively this makes sense: otherwise the chain would be confined to a finite set with positive probability, and an irreducible chain on a finite state space is recurrent. However, I'm not sure how to formalize this. Any help will be appreciated.
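As a numerical sanity check of this dichotomy, here is a short simulation sketch. The specific parameter choices are my own illustrative assumptions, not part of the question: $\alpha_x = 1/(x+2)^2$ gives $\prod_x(1-\alpha_x) > 0$ (transient), while the constant choice $\alpha_x = 1/2$ gives $\prod_x(1-\alpha_x) = 0$ (recurrent).

```python
import random

# Illustrative parameter choices (not from the question):
#   alpha_x = 1/(x+2)^2  -> prod(1 - alpha_x) > 0, so the chain is transient
#   alpha_x = 1/2        -> prod(1 - alpha_x) = 0, so the chain is recurrent
def run(alpha, steps, rng):
    """Simulate the chain from X_0 = 0; return (final state, #returns to 0)."""
    x, returns = 0, 0
    for _ in range(steps):
        if rng.random() < alpha(x):
            x, returns = 0, returns + 1  # reset to 0 with prob alpha_x
        else:
            x += 1                       # otherwise step right
    return x, returns

rng = random.Random(1)
final_t, ret_t = run(lambda x: 1.0 / (x + 2) ** 2, 10_000, rng)
final_r, ret_r = run(lambda x: 0.5, 10_000, rng)
print(final_t, ret_t)  # transient: few returns to 0, final state far to the right
print(final_r, ret_r)  # recurrent: thousands of returns, final state stays small
```

In the transient case the path makes only a few excursions back to $0$ before escaping to the right for good; in the recurrent case it keeps resetting.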


BEST ANSWER

Note that, starting from $X_0 = 0$, the chain avoids $0$ at all times $n \ge 1$ precisely when it steps right at every step, i.e. $X_n = n$ for all $n$. By the Markov property, $$\mathbb P\left(X_n \ne 0 \text{ for all } n \ge 1 \mid X_0 = 0\right) = \prod_{j=0}^\infty (1-\alpha_j).$$ The left-hand side is the probability of never returning to $0$, which is positive if and only if the state $0$ — and hence, by irreducibility, the whole chain — is transient. So $\prod_{j=0}^\infty (1-\alpha_j) > 0$ if and only if the chain is transient.
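For a concrete instance of the criterion, take the hypothetical choice $\alpha_j = 1/(j+2)^2$. Then $\prod_{j\ge 0}(1-\alpha_j) = \prod_{k\ge 2}(1 - 1/k^2)$, whose partial products telescope to $(N+1)/(2N)$, so the infinite product equals $1/2 > 0$ and this chain is transient:

```python
import math

# Hypothetical choice alpha_j = 1/(j+2)^2; then prod_{j>=0}(1 - alpha_j)
# equals prod_{k>=2}(1 - 1/k^2), which telescopes to 1/2.
def alpha(j):
    return 1.0 / (j + 2) ** 2

def partial_product(n):
    """prod_{j=0}^{n-1} (1 - alpha_j)."""
    return math.prod(1.0 - alpha(j) for j in range(n))

p = partial_product(10_000)
print(p)  # close to 1/2, so this chain is transient by the criterion
```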

Suppose the chain is transient, so almost surely every state is visited only finitely many times. Let $j,k$ be nonnegative integers with $j<k$ and set $$\tau_k = \inf\{n>0:X_n=k \}.$$ On the event $\{\tau_k = \infty\}$ the chain never leaves the finite set $\{0,1,\dots,k-1\}$: every transition either moves one step right or resets to $0$, so the only way out of that set is through $k$. But a path confined to a finite set must visit some state infinitely often, which has probability $0$ by transience. Hence $$ \mathbb P(\tau_k<\infty\mid X_0=j)=1. $$
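To illustrate the conclusion numerically, here is a Monte Carlo sketch with the hypothetical choice $\alpha_x = 1/(x+2)^2$ (a transient case, since $\prod_j(1-\alpha_j) = 1/2 > 0$), estimating $\mathbb P(\tau_{10} < \infty \mid X_0 = 3)$:

```python
import random

# Monte Carlo check of the conclusion, with the hypothetical choice
# alpha_x = 1/(x+2)^2, for which prod(1 - alpha_x) = 1/2 > 0 (transient).
def alpha(x):
    return 1.0 / (x + 2) ** 2

def reaches(j, k, rng, max_steps=100_000):
    """Run the chain from state j; report whether it hits k within max_steps."""
    x = j
    for _ in range(max_steps):
        if x == k:
            return True
        x = 0 if rng.random() < alpha(x) else x + 1
    return False

rng = random.Random(0)
trials = 2000
hits = sum(reaches(3, 10, rng) for _ in range(trials))
print(hits / trials)  # estimate of P(tau_10 < infty | X_0 = 3); should be 1.0
```

Every trial reaches $k = 10$: each excursion from $0$ climbs to $10$ before resetting with probability $\prod_{j=0}^{9}(1-\alpha_j) > 1/2$, so within the step budget a failure is astronomically unlikely, matching the proved identity.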