Stopping time for a Markov chain


Let $X_n$ be a recurrent Markov chain with state space $\mathbb N$, and let $T$ be the first hitting time of $\{1\}$. Is it true that, starting from any state, $$\mathbb E[X_{\min(T,n)}]\to \mathbb E[X_T]=1?$$ My teacher justified this with the dominated convergence theorem, but I don't understand why it applies. Of course $X_{\min(T,n)}\to X_T$ pointwise, but how can we say that there is $Y\in L^1$ such that $|X_{\min(T,n)}|\le Y$ almost everywhere?
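For intuition, one can probe the claimed convergence numerically on a concrete chain. The chain below is my own toy example, not one from the question: from state $k\ge 2$ it jumps to $1$ with probability $1/2$ and to $k+1$ otherwise, so $T$ has a geometric tail, $\sup_n X_{\min(T,n)}$ is integrable, and dominated convergence does apply. The sketch estimates $\mathbb E[X_{\min(T,n)}]$ by Monte Carlo for several $n$:

```python
import random

# Hypothetical toy chain on {1, 2, 3, ...} (my choice, not from the question):
# from state k >= 2, jump to 1 with prob 1/2, otherwise to k+1.
# T is the first hitting time of {1}; we estimate E[X_{min(T, n)}].

def stopped_value(x0, n, rng):
    """Run the chain from x0 for at most n steps, stopping on arrival at 1.

    Returns X_{min(T, n)}.
    """
    x = x0
    for _ in range(n):
        if x == 1:                      # T has already occurred
            return 1
        x = 1 if rng.random() < 0.5 else x + 1
    return x

rng = random.Random(0)
x0, trials = 5, 20_000
for n in (1, 5, 10, 20):
    est = sum(stopped_value(x0, n, rng) for _ in range(trials)) / trials
    # For this chain the exact value is 1 + (x0 + n - 1) / 2**n, which -> 1.
    print(f"n = {n:2d}:  E[X_min(T,n)] ~ {est:.4f}")
```

For this particular chain $P(T>n)=2^{-n}$ and $X_n = x_0+n$ on $\{T>n\}$, so $\mathbb E[X_{\min(T,n)}] = 1 + (x_0+n-1)2^{-n}\to 1$, matching the simulation. Note this does not settle the general question: for an arbitrary recurrent chain a dominating $Y\in L^1$ need not exist.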