Conditions on stopping time being finite


Let $(Y_n)_{n \in \mathbb{N}}$ be independent random variables taking values in $\{-1, 0, 1\}$ such that $EY_n = 0$. Define the process $(X_n)_{n \in \mathbb{N}}$ by $X_n = \sum_{k = 1}^n Y_k$, and let $\tau = \inf \{n : X_n = 1\}$. Under what conditions on $Y_n$ is $\tau$ finite almost surely?

I know that if the $Y_n$ are i.i.d. with $P(Y_n = 0) < 1$, then $\tau$ is finite a.s. I am having trouble with this exercise, though. It seems possible to me that the condition is $\sum_n (1 - P(Y_n = 0)) = \infty$. Any pointers?
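The conjecture is at least easy to probe numerically. Below is a rough Monte Carlo sketch (the sequences $1-p_n = 1$ versus $1-p_n = 2^{-n}$, the horizon, and the trial count are my own illustrative choices, not part of the problem): in the divergent case the fraction of paths hitting $1$ is close to $1$, while in the convergent case it stays well below $1$.

```python
import random

random.seed(0)
TRIALS, HORIZON = 1000, 4000

def hit_fraction(q):
    """Estimate P(tau <= HORIZON), where q[n-1] = 1 - p_n = P(Y_n != 0)."""
    hits = 0
    for _ in range(TRIALS):
        x = 0
        for n in range(HORIZON):
            if random.random() < q[n]:                   # non-lazy step...
                x += 1 if random.random() < 0.5 else -1  # ...symmetric +-1
                if x == 1:
                    hits += 1
                    break
    return hits / TRIALS

# sum (1 - p_n) = infinity: the simple random walk (no lazy steps)
frac_div = hit_fraction([1.0] * HORIZON)
# sum (1 - p_n) = sum 2^{-n} < infinity: almost all steps are lazy
frac_conv = hit_fraction([2.0 ** -(n + 1) for n in range(HORIZON)])

print(frac_div, frac_conv)
```

In the divergent case the estimate is already near $1$ at this horizon (and tends to $1$ as the horizon grows), whereas in the convergent case it is bounded away from $1$ no matter how long we wait.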


The reasoning below is based on an argument included in Probability with Martingales (David Williams): Hitting times for simple random walk.

Firstly, note that since $Y_n$ takes values in $\{-1,0,1\}$, the condition $E[Y_n]=0$ implies that $Y_n$ is symmetric.
So, setting $\ p_n = P(Y_n=0)$, we get $\ P(Y_n=-1)=P(Y_n=1)=\frac{1-p_n}{2}$.

For all $\ \theta \in \mathbb{R}$, define the moment generating function of $Y_n$ $$\ \psi_n(\theta)\ =\ E[e^{\theta Y_n}]\ =\ p_n+\frac{1-p_n}{2}(e^\theta+e^{-\theta})\ =\ p_n+(1-p_n)\cosh(\theta)$$

and their product:

$$ \Psi_n(\theta)=\prod_{k=1}^n \psi_k(\theta), \qquad \Psi_0(\theta)=1. $$

Note that, for $\theta \neq 0$, we can write $\psi_n(\theta) = 1 + (1-p_n)(\cosh\theta - 1) \geq 1$, so $\Psi_n(\theta)$ is non-decreasing in $n$, and its limit is finite if and only if $\sum_n (1-p_n)(\cosh\theta - 1) < +\infty$, i.e. if and only if $$ \sum_n (1-p_n) < +\infty. $$
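This dichotomy is easy to see numerically. A small sketch (with $\theta = 1$ and the illustrative choices $1-p_n = 1/n^2$ versus $1-p_n = 1/n$, which are my own examples): the log of the product stabilizes in the convergent case and keeps growing in the divergent one.

```python
import math

theta = 1.0
growth = math.cosh(theta) - 1.0  # psi_k(theta) = 1 + (1 - p_k) * growth

def log_Psi(q, N):
    """log Psi_N(theta) = sum_{k<=N} log psi_k(theta), with q(k) = 1 - p_k."""
    return sum(math.log1p(q(k) * growth) for k in range(1, N + 1))

checkpoints = (10**3, 10**4, 10**5)
# sum (1 - p_k) = sum 1/k^2 < infinity: log Psi_N stabilizes
log_conv = [log_Psi(lambda k: 1.0 / k**2, N) for N in checkpoints]
# sum (1 - p_k) = sum 1/k = infinity: log Psi_N keeps growing
log_div = [log_Psi(lambda k: 1.0 / k, N) for N in checkpoints]

print(log_conv, log_div)
```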

Finally, for fixed $\theta$, define the two martingales (for $n \geq 0$): $$ X_n = \sum_{k=1}^n Y_k\ , \qquad X_0=0 $$ $$ M_n = \frac{e^{\theta X_n}}{\Psi_n(\theta)}, \qquad M_0 = 1, $$ and the stopping time $\tau= \inf\{n\geq0 : X_n=1\}$.

Applying the Optional Stopping Theorem (OST) at the bounded stopping time $\tau \wedge n$, we get: $$ 1=E[M_0]=E[M_{\tau \wedge n}] = E\left[ \frac{e^{\theta X_{\tau \wedge n}}}{\Psi_{\tau \wedge n}(\theta)}\right]. $$

Now let us restrict ourselves to $\theta>0$, so that, since $X_{\tau \wedge n} \leq 1$ and $\Psi_{\tau \wedge n}(\theta) \geq 1$, $$\frac{e^{\theta X_{\tau \wedge n}}}{\Psi_{\tau \wedge n}(\theta)} \leq e^{\theta}.$$

Now assume that $\sum_n (1-p_n) = +\infty$, so that $\Psi_n(\theta)$ diverges.
Taking the limit as $n \to \infty$ and applying the Dominated Convergence Theorem (DOM), and noting that $X_\tau = 1$ on $\{\tau < +\infty\}$, we get: $$ 1 = E\left[\frac{e^{\theta }}{\Psi_{\tau }(\theta)}\ \text{I}_{\tau < + \infty}\right] + E\left[\ \lim_n \frac{e^{\theta X_{n }}}{\Psi_{n }(\theta)}\ \text{I}_{\tau = + \infty}\right], $$ where the second term vanishes because $\Psi_n(\theta)$ diverges while $e^{\theta X_n} \leq e^\theta$.

Therefore, rearranging: $$ e^{-\theta} = E\left[\{\Psi_{\tau }(\theta)\}^{-1}\ \text{I}_{\tau < + \infty}\right]. $$
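As a sanity check, this identity can be tested by simulation in the simple-random-walk case $p_n = 0$, where $\Psi_n(\theta) = \cosh(\theta)^n$ and the identity reads $E[\cosh(\theta)^{-\tau}\,\text{I}_{\tau < +\infty}] = e^{-\theta}$. A rough Monte Carlo sketch ($\theta = 0.5$, the trial count, and the path-length cap are arbitrary choices of mine; paths exceeding the cap contribute essentially nothing since $\cosh(\theta)^{-n}$ is then negligible):

```python
import math
import random

random.seed(1)
theta, trials, cap = 0.5, 20000, 2000
c = math.cosh(theta)  # for p_n = 0, Psi_n(theta) = cosh(theta)**n

total = 0.0
for _ in range(trials):
    x, n = 0, 0
    while x != 1 and n < cap:   # run the simple random walk until it hits 1
        x += random.choice((-1, 1))
        n += 1
    if x == 1:                  # tau = n; paths with tau > cap contribute ~0
        total += c ** (-n)
est = total / trials

print(est, math.exp(-theta))
```

The two printed values agree to within Monte Carlo error.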

Using DOM again (note that $0 \leq \{\Psi_{\tau}(\theta)\}^{-1} \leq 1$ and that $\Psi_{\tau}(\theta) \to 1$ as $\theta \downarrow 0$), letting $\theta \downarrow 0$ in the last expression yields: $$ 1=E[\text{I}_{\tau < + \infty}], $$ i.e. $\tau$ is finite almost surely.

This concludes the case $\sum_n (1-p_n) = + \infty$.

For the other case, $\sum_n (1-p_n) < +\infty$, the martingale $X_n$ is uniformly integrable, since it is bounded in $L^2$: $E[X_n^2] = \sum_{k=1}^n (1-p_k)$. Therefore, $X_n$ converges a.s. and in $L^1$ to an integrable r.v. $X_{\infty}$, and the OST (together with $X_\tau = 1$ on $\{\tau < +\infty\}$) gives: $$ 0 = E[\text{I}_{\tau < + \infty}] + E[X_{\infty}\ \text{I}_{\tau = + \infty}] . $$

If $\{\tau = + \infty\}$ were a null set, we would get $0 = E[\text{I}_{\tau < + \infty}] = 1$, which is a contradiction. Hence $P(\tau = + \infty) > 0$, i.e. $\tau$ is not almost surely finite.

Lastly, note that unless we are in the trivial case $p_n=1$ for all $n$, we still have $\ P(\tau < + \infty)>0$.


The argument above uses martingale theory to derive the results.
However, I would like to give you some intuition that, if formalized, can lead to more direct proofs of these results.

The process $X_n$ is a modified simple random walk: it can take some "lazy" steps, moving forward in time while staying still in space, i.e. $Y_n=0$.
In contrast, when it moves up or down, it does so symmetrically, i.e. like a simple random walk.

The condition $\sum_n (1-p_n) < +\infty$ versus $\sum_n (1-p_n) = +\infty$ determines, via the Borel–Cantelli lemmas (the second applies thanks to independence), whether the walk takes finitely or infinitely many non-lazy steps.

More precisely, when $\sum_n (1-p_n) < +\infty$, $\ P(Y_n = 0 \text{ eventually})=1$: the walk moves up and down for a while, then becomes constant.
When $\sum_n (1-p_n) = +\infty$, $\ P(Y_n \ne 0 \text{ infinitely often})=1$: the process keeps moving forever, and intuitively, if we remove the lazy steps, all we are left with is a simple random walk, which hits $1$ almost surely.
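By independence, $P(Y_n \neq 0 \text{ for some } n \geq m) = 1 - \prod_{n \geq m} p_n$, which makes this dichotomy easy to check numerically. A small sketch (the sequences $1-p_n = 1/n^2$ and $1-p_n = 1/n$ and the truncation level are illustrative choices of mine):

```python
import math

N = 10**5  # truncation level standing in for infinity

def tail_nonlazy_prob(q, m):
    """P(Y_n != 0 for some n in [m, N]) = 1 - prod_{n=m}^{N} (1 - q(n))."""
    log_prod = sum(math.log1p(-q(n)) for n in range(m, N + 1))
    return 1.0 - math.exp(log_prod)

# sum q_n = sum 1/n^2 < infinity: the chance of any non-lazy step
# after time m vanishes as m grows (first Borel-Cantelli lemma)
conv = [tail_nonlazy_prob(lambda n: 1.0 / n**2, m) for m in (10, 100, 1000)]
# sum q_n = sum 1/n = infinity: a non-lazy step after time m stays
# (essentially) certain for every m (second Borel-Cantelli lemma)
div = [tail_nonlazy_prob(lambda n: 1.0 / n, m) for m in (10, 100, 1000)]

print(conv, div)
```

The divergent-case probabilities fall short of exactly $1$ only because of the truncation at $N$.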