Behavior of transient states as $n \rightarrow \infty$


Let $(X_n)_{n \geq 0}$ be a discrete time-homogeneous Markov chain on the state space $E$. Suppose $T \subseteq E$ is the set of transient states.

Can it happen, with positive probability, that the chain stays in $T$ forever?

If yes, how can it be that $P_y[X_n=x] \rightarrow 0$ for all $x \in T, y \in E$?

If no, how can one see this intuitively? Couldn't there be an infinite set of transient states, with no single transient state visited infinitely often, while the Markov chain nevertheless stays within $T$ forever?

Accepted answer:

I don't see why not. Consider the simple random walk on $E=\mathbb{Z}^3$. By Pólya's theorem all states are transient, i.e. $T=E$, and of course the walk never leaves $T$.

Now, there is a standard inequality (a consequence of Cauchy–Schwarz and the symmetry of the walk) which says that in the above setting, for any two states $x,y$ and all $n$, $$ \mathbb{P}_x[X_{2n}=y] \leq \mathbb{P}_x[X_{2n}=x]. $$ Since $x$ is transient, the expected number of visits to $x$ is finite, i.e. $\sum_n \mathbb{P}_x[X_n=x] < \infty$, so the terms must tend to $0$: $\lim_{n\to\infty} \mathbb{P}_x[X_n=x]=0$. Hence $\mathbb{P}_x[X_{2n}=y] \to 0$ as well, and odd times are handled by conditioning on the first step.
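As a numerical sanity check, one can estimate by simulation how often the 3D walk ever returns to its starting point; Pólya's return probability for $\mathbb{Z}^3$ is about $0.34 < 1$, consistent with transience. A minimal sketch (the function name and parameters are illustrative, not from the original post):

```python
import random

def srw3d_return_estimate(n_walks=2000, n_steps=500, seed=0):
    """Monte Carlo estimate of the probability that a simple random walk
    on Z^3, started at the origin, returns to the origin within n_steps."""
    rng = random.Random(seed)
    steps = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    returned = 0
    for _ in range(n_walks):
        x = y = z = 0
        for _ in range(n_steps):
            dx, dy, dz = rng.choice(steps)
            x += dx; y += dy; z += dz
            if x == y == z == 0:   # walk came back to the origin
                returned += 1
                break
    return returned / n_walks
```

The estimate hovers well below $1$, in line with the fact that the walk escapes to infinity with positive probability while remaining in the all-transient state space forever.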

I hope this clarifies the picture.


Another simple example:

Consider the random walk on $\mathbb{Z}$ that jumps one step to the right with probability $p=1$. Each state is visited at most once, so every state is transient; there are infinitely many communicating classes and no closed class. Naturally, then, the chain stays within the set of transient states forever.