Markov Chain with Hitting Probability Bounded Below by $\alpha>0$


Let $\{X_t\}$ be a discrete-time Markov chain with state space $S$, and let $T\subseteq S$ be a subset. For $s\in S$, let $p_s$ be the probability of hitting $T$ starting from $s$, so that $p_s=1$ for $s\in T$. Suppose there exists $\alpha>0$ such that $p_s\geq\alpha$ for all $s\in S$. Must we have $p_s=1$ for all $s$?

By a somewhat convoluted argument, I can prove the answer is "yes" when, for every $s\in S$, only finitely many states are reachable in one step from $s$. As I cannot find a counterexample, I wonder whether the result holds in general.
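As a sanity check on why the uniform bound $\alpha>0$ matters (an illustrative example of mine, not from the post): for the random walk on $\{0,1,2,\ldots\}$ that steps up with probability $p=2/3$ and down with probability $q=1/3$, with $T=\{0\}$, the classical answer is $p_s=(q/p)^s=2^{-s}$. So $p_s>0$ for every $s$, yet $\inf_s p_s=0$ and $p_s<1$ for $s\ge 1$. The sketch below approximates these hitting probabilities on a truncated state space by fixed-point iteration on the recursion $p_s = q\,p_{s-1} + p\,p_{s+1}$; the truncation level `N` and sweep count are ad hoc approximation parameters.

```python
# Hitting probabilities for a biased random walk on {0, 1, ..., N}
# (illustrative example, not from the original post). From s > 0 the
# walk moves up with probability 2/3 and down with probability 1/3;
# T = {0}. Exact answer on the full half-line: p_s = (1/2)^s, which is
# positive for every s but not bounded away from zero, and < 1 for s >= 1.

N = 60                      # truncation level (approximation parameter)
p_up, q_down = 2 / 3, 1 / 3

h = [0.0] * (N + 1)
h[0] = 1.0                  # states in T hit T with probability 1
                            # (h[N] = 0 is the truncation boundary)

# Fixed-point iteration on h[s] = q * h[s-1] + p * h[s+1]; starting
# from the zero lower bound, the sweeps increase monotonically to the
# truncated solution, which is ~2^{-s} since 2^{-60} is negligible.
for _ in range(5000):
    for s in range(1, N):
        h[s] = q_down * h[s - 1] + p_up * h[s + 1]

for s in range(1, 5):
    print(s, round(h[s], 4))   # approx. 0.5, 0.25, 0.125, 0.0625
```

So positivity of every $p_s$ alone is not enough; the question is precisely whether a *uniform* lower bound changes the conclusion.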

Answer:

It does. Consider the event $A:=\{(X_t)\text{ hits }T\}$, write $P^s$ for the law of $(X_t)$ started at $s$, and let $\mathcal F_n=\sigma(X_0,X_1,\ldots,X_n)$. On the one hand, by martingale convergence applied to the bounded martingale $P^s[A\mid\mathcal F_n]$, $$ 1_A =\lim_n P^s[A\mid\mathcal F_n] \quad\text{a.s. } P^s. $$ On the other hand, $A\supset A_n:=\{(X_t)\text{ hits }T\text{ at some time at or after }n\}$, so by the Markov property at time $n$: $$ P^s[A\mid\mathcal F_n]\ge P^s[A_n\mid\mathcal F_n]=P^{X_n}[A]=p_{X_n}\ge \alpha>0. $$ Letting $n\to\infty$, it follows that $1_A\ge\alpha$ a.s. $P^s$; since $1_A$ takes only the values $0$ and $1$, this forces $1_A=1$ a.s. $P^s$. That is, $p_s=P^s[A]=1$ for each $s\in S$.
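To see the conclusion on a concrete finite chain (a toy example of mine, not from the answer): take $S=\{0,1,2\}$ and $T=\{0\}$, where state $1$ moves to $0$ or $2$ with probability $1/2$ each, and state $2$ moves to $1$ or stays at $2$ with probability $1/2$ each. Every state can reach $T$, so on this finite state space some $\alpha>0$ works, and iterating the hitting-probability recursion $p_s=\sum_t P(s,t)\,p_t$ with $p_0=1$ confirms $p_s=1$ for all $s$.

```python
# Tiny finite illustration of the result (example chain chosen for
# illustration): S = {0, 1, 2}, T = {0}.
# Transitions from the transient states, as (target, probability) pairs:
P = {1: [(0, 0.5), (2, 0.5)],
     2: [(1, 0.5), (2, 0.5)]}

p = {0: 1.0, 1: 0.0, 2: 0.0}   # start from the trivial lower bound
for _ in range(200):            # iterate p_s = sum_t P(s, t) * p_t
    for s, moves in P.items():
        p[s] = sum(prob * p[t] for t, prob in moves)

print(p)   # p[1] and p[2] both converge to 1.0
```

The iteration increases monotonically to the true hitting probabilities, and here both transient states end up at $1$, matching the proof's conclusion.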