When Convergence Implies Integrable Stopping Times

Let $(X_n)_{n\in \mathbb{N}}$ be a stochastic process such that: $$ n^{-1}X_n \to k < 0 \quad \text{a.s.} $$ I want to prove that if $X_0=0$ and $\tau=\inf\{ n\geq 1: X_n\leq 0 \}$, then $\mathbb{E}[\tau]<\infty$, or at least to determine under what conditions this holds. My instinct is that it is true if the process is a time-homogeneous Markov chain (and perhaps under weaker conditions), but I'm not sure where to begin with proving it.

You mentioned time-homogeneous Markov chains. Here's one idea to explore: you can construct such a chain that moves up the non-negative integers one step at a time before eventually jumping to state $-1$, with literally any distribution of $\tau$ that you desire!

If you want $P(\tau=m)=\alpha_m$ for $m\geq 1$, then for each such $m$ define the transition probabilities $p_{m-1,-1}=\alpha_m/(\sum_{i=m}^\infty \alpha_i)$, and $p_{m-1,m}=1-p_{m-1,-1}$.

Then you can separately arrange the behaviour of your chain on the negative integers so that, starting from $-1$, it always stays negative and satisfies your desired condition that $n^{-1} X_n \to k$ as $n\to\infty$ with probability $1$. (If that convergence holds for the path started at $-1$, then it doesn't matter what finite initial portion you prepend; it will still hold.)
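To see how this construction can break integrability, here is a quick simulation sketch (my own illustration, with an assumed choice of $(\alpha_m)$) of the non-negative part of such a chain. Take $\alpha_m = 1/(m(m+1))$: the series telescopes to $1$, so it is a probability distribution, yet $\sum_m m\,\alpha_m = \sum_m 1/(m+1) = \infty$, so $\mathbb{E}[\tau]=\infty$ even though the chain is time-homogeneous. The tail $\sum_{i=m}^\infty \alpha_i$ telescopes to $1/m$, so the jump probability simplifies to $p_{m-1,-1} = \alpha_m/(1/m) = 1/(m+1)$.

```python
import random

# Assumed choice: alpha_m = 1/(m*(m+1)) for m >= 1.
# Sum telescopes to 1 (probability distribution), but
# E[tau] = sum_m m * alpha_m = sum_m 1/(m+1) diverges.
# Tail sum_{i=m}^inf alpha_i = 1/m, hence p_{m-1,-1} = 1/(m+1).

def sample_tau(rng):
    """Run the chain from X_0 = 0 up through 0, 1, 2, ...
    until it jumps to -1; return the jump time tau."""
    n = 0
    while True:
        n += 1  # the chain sits in state n-1, deciding whether tau = n
        if rng.random() < 1.0 / (n + 1):  # p_{n-1,-1} = 1/(n+1)
            return n
        # otherwise move up to state n and continue

rng = random.Random(0)
samples = [sample_tau(rng) for _ in range(100_000)]
# Empirical P(tau = m) should match alpha_m:
print(samples.count(1) / len(samples))  # close to alpha_1 = 1/2
print(samples.count(2) / len(samples))  # close to alpha_2 = 1/6
```

The same loop works verbatim for any other choice of $(\alpha_m)$ by replacing the jump probability with $\alpha_n/\sum_{i=n}^\infty \alpha_i$; a summable sequence $m\,\alpha_m$ gives $\mathbb{E}[\tau]<\infty$, and a non-summable one does not.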