Does oscillating once imply oscillating indefinitely?


Let $\mathbb{S}$ be countable and let $X_n$ be the coordinate maps on $\mathbb{S}^\mathbb{N}$. Let $p$ be a transition probability such that $X_n$ is an irreducible transient Markov chain, and let $\tau_A = \inf \{n \ge 0: X_n\in A\}$ be the first hitting time of $A$.

Let $A,B \subset \bar{\mathbb{S}}$ be disjoint, where $\bar{\mathbb{S}} = \mathbb{S}\cup\partial \mathbb{S}$ and $\partial \mathbb{S}$ is the Martin boundary. Suppose that $P^x (\tau_A <\tau_B <\infty) > 0$. Does this imply that with positive probability $X_n$ oscillates between $A$ and $B$ indefinitely? More rigorously, set $\tau_B^0 = -1$ and define recursively \begin{align} \tau_A^k &= \tau_B^{k-1} + \tau_A \circ \theta^{\tau_B^{k-1}}, \\ \tau_B^k &= \tau_A^k + \tau_B \circ \theta^{\tau_A^k}. \end{align} Is it then true that $P^x(\tau_A^k, \tau_B^k < \infty \text{ for all } k \ge 1) > 0$?
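The recursive stopping times above can be illustrated numerically. The sketch below is not the setting of the question (it uses finite sets $A,B \subset \mathbb{S}$ rather than subsets of $\bar{\mathbb{S}}$, and a specific chain): it simulates a transient nearest-neighbour walk on $\mathbb{Z}$ with upward drift, an assumption chosen only because transience is easy to verify there, and counts completed $A$-then-$B$ oscillations along each path by scanning for the successive hitting times $\tau_A^k, \tau_B^k$.

```python
import random

def run_chain(p=0.7, steps=2000, seed=None):
    """Simulate a transient nearest-neighbour walk on Z (up-probability p > 1/2)."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(steps):
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

def oscillation_count(path, A, B):
    """Count completed A-then-B oscillations along a finite path,
    mimicking the recursive stopping times tau_A^k, tau_B^k."""
    k, t, n = 0, 0, len(path)
    while True:
        # next visit to A at or after time t (tau_A^{k+1})
        while t < n and path[t] not in A:
            t += 1
        if t >= n:
            return k
        # next visit to B after that (tau_B^{k+1}); A and B are disjoint
        while t < n and path[t] not in B:
            t += 1
        if t >= n:
            return k
        k += 1

# Hypothetical choice of sets for illustration: A = {0}, B = {5}.
counts = [oscillation_count(run_chain(seed=s), A={0}, B={5}) for s in range(200)]
print(max(counts))  # counts stay small: the walk drifts to +infinity
```

Consistent with the intuition in the question, every simulated path completes only finitely many oscillations; with drift $p = 0.7$ the return probability from $5$ to $0$ is $((1-p)/p)^5 \approx 0.015$, so most paths oscillate exactly once.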

EDIT: I'm actually trying to derive a contradiction. Since $X_n \rightarrow X_\infty$, where $X_\infty:\mathbb{S}^\mathbb{N} \rightarrow \partial \mathbb{S}$, the chain cannot oscillate between $A$ and $B$ indefinitely, and it would therefore follow that $P^x (\tau_A <\tau_B <\infty) = 0$.