Bounding survival probability of an asymmetric random walk by a symmetric one


Consider two random walks that start from the point $x=0$ at time $t=0$ and at each step move either one unit to the right ($x+1$) or one unit to the left ($x-1$):

1) Walker 1's first move is to the right or to the left with equal probability. Every subsequent move, however, depends on the previous one: if he moves to the right in the $k$th step, he is more likely to move to the left in the $(k+1)$th step, and vice versa. Denote his location after the $n$th step by $X_n$.

2) Walker 2 is a symmetric random walker who at every step moves right or left with equal probability. Denote his location after the $n$th step by $Y_n$.

Prove that for all $m,n\geq 0$: $$\mathbb{P}\left(\max\limits_{1 \leq k \leq n}X_k\geq m\right) \leq \mathbb{P}\left(\max\limits_{1 \leq k \leq n}Y_k\geq m\right)\tag{1}$$

Intuitively, Walker 1's location is more concentrated around the origin, since every time it moves to the right it tends to move back to the left in the next step; but I found this hard to prove. I have verified it using simulation. I thought about induction:
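For concreteness, a Monte Carlo check of the claim might look like the sketch below. The persistence probability `p_repeat = 0.3` (the chance of repeating the previous direction, which must be below $1/2$ for Walker 1) and the values $n = 20$, $m = 3$ are illustrative choices, not part of the problem statement:

```python
import random

def max_asym_walk(n, p_repeat):
    """One realization of Walker 1: each step repeats the previous
    direction with probability p_repeat (< 1/2), else reverses.
    Returns max_{1 <= k <= n} X_k."""
    step = random.choice([-1, 1])
    x = best = step
    for _ in range(n - 1):
        if random.random() >= p_repeat:  # reverse direction
            step = -step
        x += step
        best = max(best, x)
    return best

def max_sym_walk(n):
    """One realization of Walker 2 (simple symmetric walk).
    Returns max_{1 <= k <= n} Y_k."""
    x = 0
    best = float("-inf")
    for _ in range(n):
        x += random.choice([-1, 1])
        best = max(best, x)
    return best

def estimate(sampler, m, trials=50_000):
    """Monte Carlo estimate of P(max of the walk >= m)."""
    return sum(sampler() >= m for _ in range(trials)) / trials

p_asym = estimate(lambda: max_asym_walk(20, 0.3), m=3)
p_sym = estimate(lambda: max_sym_walk(20), m=3)
print(p_asym, p_sym)
```

With these parameters the estimate for Walker 1 comes out noticeably below the symmetric one, matching the intuition, though of course a simulation at fixed parameters proves nothing about all cases.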

Base case: let $n=m$. Eq. (1) follows immediately, since $n=m$ forces the walkers to make all $m$ steps to the right, which is less likely for Walker 1.

Now, assume (1) is true for $m,n$. We want to conclude (1) for $m,n+1$. Observe:

$$\mathbb{P}\left(\max\limits_{1 \leq k \leq n+1}X_k\geq m\right)= \mathbb{P}\left(\max\limits_{1 \leq k \leq n}X_k\geq m\right)\\+ \mathbb{P}\left(\left\{\max\limits_{1 \leq k \leq n}X_k\leq m-1\right\} \cap\{ X_{n}=m-1\} \cap \{X_{n+1}=m\} \right).$$

The first term on the right-hand side is bounded immediately by the induction hypothesis (1), but the second term cannot be easily bounded. Any ideas?


$\def\aseq{\stackrel{\mathrm{a.s.}}{=}}\def\peq{\mathrel{\phantom{=}}{}}$Too long for a comment.

It seems that this proposition is not necessarily true. The proposition can be restated as:

Given that $U_1, U_2, \cdots$ are random variables taking values in $\{-1, 1\}$ with\begin{gather*} P(U_1 = -1) = P(U_1 = 1) = \frac{1}{2},\\ P(U_{n + 1} = U_n \mid \mathscr{F}_n) = P(U_{n + 1} = U_n \mid U_n) \leqslant \frac{1}{2}, \end{gather*} where $\mathscr{F}_n = σ(U_1, \cdots, U_n)$, and $V_1, V_2, \cdots$ are i.i.d. with $P(V_1 = -1) = P(V_1 = 1) = \dfrac{1}{2}$. Define $X_n = \sum\limits_{k = 1}^n U_k$, $Y_n = \sum\limits_{k = 1}^n V_k$, then$$ P\left( \max_{1 \leqslant k \leqslant n} X_k \geqslant m \right) \leqslant P\left( \max_{1 \leqslant k \leqslant n} Y_k \geqslant m \right). \quad \forall n \geqslant 1,\ m \geqslant 0 $$

Equivalently, it claims that$$ P(X_1 < m, \cdots, X_n < m) \geqslant P(Y_1 < m, \cdots, Y_n < m). \quad \forall n \geqslant 1,\ m \geqslant 0 $$

Now, inductively construct $\{U_n\}_{n \geqslant 2}$ like this: Suppose that $U_1, \cdots, U_n$ are already constructed, then choose an event $A_n$ independent of $σ(U_1, \cdots, U_n)$ such that $P(A_n) = p_n \leqslant \dfrac{1}{2}$ and take $U_{n + 1} = U_n I_{A_n} - U_n I_{A_n^c}$. The $\{U_n\}$ so constructed satisfies$$ P(U_{n + 1} = U_n \mid \mathscr{F}_n) = P(U_{n + 1} = U_n \mid U_n) = P(A_n \mid U_n) \aseq p_n \leqslant \frac{1}{2}. $$
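This construction is straightforward to sample; the sketch below does so, with the keep-probabilities `p` (one $p_n \leqslant \frac{1}{2}$ per step) supplied as a hypothetical input:

```python
import random

def sample_U(p):
    """Sample U_1, ..., U_{len(p)+1} following the construction:
    U_{n+1} = U_n on the independent event A_n (probability p_n),
    and U_{n+1} = -U_n on its complement."""
    U = [random.choice([-1, 1])]      # P(U_1 = -1) = P(U_1 = 1) = 1/2
    for p_n in p:
        A_n = random.random() < p_n   # A_n independent of U_1, ..., U_n
        U.append(U[-1] if A_n else -U[-1])
    return U

# Empirical sanity check that P(U_2 = U_1) matches p_1 = 0.3:
trials = 40_000
keep = sum(u[0] == u[1] for u in (sample_U([0.3]) for _ in range(trials)))
print(keep / trials)
```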

For $m = 0$, if $n = 2$ and $p_1 < \dfrac{1}{2}$, then$$ P(X_1 < 0, X_2 < 0) = P(U_1 = -1, U_2 = -1) = \frac{1}{2} p_1 < \frac{1}{4} = P(Y_1 < 0, Y_2 < 0). $$ For $m = 1$, if $n = 3$ and $(1 - p_1) p_2 > \dfrac{1}{4}$, then\begin{align*} &\peq P(X_1 < 1, X_2 < 1, X_3 < 1)\\ &= P(U_1 = -1, U_2 = 1, U_3 = -1) + P(U_1 = -1, U_2 = -1, U_3 = 1)\\ &\peq + P(U_1 = -1, U_2 = -1, U_3 = -1)\\ &= \frac{1}{2} (1 - p_1)(1 - p_2) + \frac{1}{2} p_1 (1 - p_2) + \frac{1}{2} p_1 p_2\\ &= \frac{1}{2} (1 - (1 - p_1) p_2) < \frac{3}{8} = P(Y_1 < 1, Y_2 < 1, Y_3 < 1). \end{align*} Thus it seems that the given proposition is not necessarily true in general.
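The $m = 1$, $n = 3$ computation above can be verified by exact enumeration of the $2^3$ sign sequences. The values $p_1 = 0.3$, $p_2 = 0.5$ below are one illustrative choice satisfying $p_n \leqslant \frac{1}{2}$ and $(1 - p_1) p_2 = 0.35 > \frac{1}{4}$; the symmetric walker is recovered by setting every keep-probability to $\frac{1}{2}$:

```python
from itertools import product

def prob_all_below(m, p):
    """Exact P(X_1 < m, ..., X_n < m) for the constructed chain,
    where p = (p_1, ..., p_{n-1}) are the keep-probabilities."""
    n = len(p) + 1
    total = 0.0
    for steps in product((-1, 1), repeat=n):
        # Chain probability of this sign sequence.
        pr = 0.5  # P(U_1 = steps[0])
        for k in range(1, n):
            pr *= p[k - 1] if steps[k] == steps[k - 1] else 1 - p[k - 1]
        # Check that every partial sum stays below m.
        x, ok = 0, True
        for s in steps:
            x += s
            if x >= m:
                ok = False
                break
        if ok:
            total += pr
    return total

p1, p2 = 0.3, 0.5
lhs = prob_all_below(1, (p1, p2))    # asymmetric chain: 1/2 (1 - (1-p1) p2)
rhs = prob_all_below(1, (0.5, 0.5))  # symmetric walk: 3/8
print(lhs, rhs)  # 0.325 < 0.375, so the claimed inequality fails here
```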