Simple Random Walk and the Strong Markov Property


Question

Consider $p \in(0,1)$ and a simple walk $\left(Y_{n}\right)_{n \geq 0}$ on $\mathbb{Z}$ with transition matrix

$$ P_{n, m}=\left\{\begin{array}{cc} p & \text { if } m=n+1 \\ 1-p & \text { if } m=n-1 \\ 0 & \text { otherwise. } \end{array}\right. $$

Let us set $$ H_{0}=\inf \left\{n \geq 0: Y_{n}=0\right\} $$ be the hitting time of the value $0$ by the chain.

a) Show that almost surely, $$ \lim _{n \rightarrow \infty} \frac{Y_{n}}{n}=2 p-1 $$

b) Let us consider

$$ f(y)=\mathbb{P}_{y}\left(H_{0}<+\infty\right), \forall y \geq 0 . $$

(i) When $p<\frac{1}{2}$, show that $f(y)=1, \forall y \geq 0$.

(ii) Assume that $p \geq \frac{1}{2}$.

Show using the strong Markov property for a relevant stopping time that $$ f(y)=f(1)^{y}, \forall y \geq 1 . $$


My Attempts

(a)

It is clear that $Y_{n}=\sum_{k=1}^{n} X_{k}$ (taking $Y_0=0$), where the $X_{k}$ are i.i.d. steps with $\mathbb{P}(X_k=1)=p$ and $\mathbb{P}(X_k=-1)=1-p$ (equivalently $X_k=2B_k-1$ with $B_k \sim \operatorname{Ber}(p)$). The mean is $\mu=\mathbb{E}\left[X_{k}\right]=p \cdot 1+(1-p)\cdot(-1)=2 p-1$, and $\mathbb{E}\left|X_{k}\right|=p+(1-p)=1<\infty$. Defining $S_{n}:=\frac{1}{n} Y_{n}=\frac{1}{n} \sum_{k=1}^{n} X_{k}$, the strong law of large numbers gives $$ S_{n}=\frac{1}{n} \sum_{k=1}^{n} X_{k}=\frac{Y_{n}}{n} \rightarrow 2 p-1, \quad \mathbb{P}\text{-a.s.} \quad \square $$

(b)

(i) From part (a), we have $\lim _{n \rightarrow \infty} \frac{Y_{n}}{n}=2 p-1<0$ since $p<\frac{1}{2}$, so $Y_{n} \rightarrow-\infty$ almost surely. Since the walk moves in steps of $\pm 1$, starting from a value $y \geq 0$ it must pass through $0$ on its way to negative values. (Note that one cannot argue via recurrence of the state $0$: for $p \neq \frac{1}{2}$ the chain is transient.) $$ \therefore f(y)=\mathbb{P}_{y}\left\{H_{0}<\infty\right\}=1 \quad \forall y \geqslant 0 \quad \square $$

(ii) Here is where I get stuck: I don't fully understand the strong Markov property; I have seen various formulations and am still unsure. Any help would be greatly appreciated :)

Best Answer

Part (a) is indeed solved immediately by the strong law of large numbers.
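As a quick numerical sanity check of the SLLN claim (a hedged sketch: the walk length, seeds, and values of $p$ below are my illustrative choices, not from the question), one can simulate the walk and compare $Y_n/n$ with the drift $2p-1$:

```python
import random

def walk_average(p, n, seed):
    """Simulate n steps of the simple walk (steps +1 w.p. p, -1 w.p. 1-p)
    started at 0 and return Y_n / n."""
    rng = random.Random(seed)
    y = 0
    for _ in range(n):
        y += 1 if rng.random() < p else -1
    return y / n

# Y_n / n should be close to the drift 2p - 1 for large n.
for p in (0.3, 0.5, 0.7):
    print(p, walk_average(p, 200_000, seed=0), 2 * p - 1)
```

With $n = 200{,}000$ the standard deviation of $Y_n/n$ is about $\sqrt{(1-(2p-1)^2)/n} \approx 0.002$, so the printed averages should sit within a few thousandths of $2p-1$.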

There is also a martingale argument for part (b.i). Write $P_y(A)=P(A\mid Y_0=y)$ and let $p<1/2$. Start from $Y_0=y$ and set $\tau=\inf\{k:Y_k=0\}$. The process $Z_n=Y_n-n(2p-1)$ is a martingale, so by optional stopping at the bounded stopping time $n\wedge\tau$, $$y=E[Z_{n\wedge \tau}]=E[\underbrace{Y_{n \wedge \tau}}_{\geq 0}-(n\wedge \tau)(2p-1)]\geq -(2p-1)E[n\wedge \tau]$$ (note $-(2p-1)>0$). So $$E[n\wedge \tau]\leq \frac{y}{-(2p-1)}\stackrel{\textrm{MCT}}{\implies}E[\tau] \leq \frac{y}{-(2p-1)}.$$ Then by Markov's inequality, $$P(\tau\geq n)\leq \frac{E[\tau]}{n}\to 0\implies P(\tau=\infty)=0\implies P_y(\tau<\infty)=1,\,\forall y\geq 0.$$ Alternatively for (b.i), you can use the fact that starting from $Y_0=0$, $$Y_n=n\cdot\frac{Y_n}{n}\to \infty\cdot \underbrace{(2p-1)}_{<0}=-\infty \textrm{ a.s.},$$ so $Y_n$ hits $-y$ for all $y\geq 0$ a.s., and by translation invariance $P_y(H_0<\infty)=1,\,\forall y\geq 0$.
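A small simulation can illustrate the bound $E[\tau]\leq y/(1-2p)$ (in fact Wald's identity gives equality here, since $\tau<\infty$ a.s. and $Y_\tau=0$). This is a hedged sketch: the parameters $p=0.4$, $y=5$, the trial count, and the safety cap are my illustrative choices:

```python
import random

def hitting_time(p, y, seed, cap=10**6):
    """First time n with Y_n = 0, for a walk started at Y_0 = y.
    `cap` guards against runaway loops (practically never reached
    when p < 1/2, since the drift pushes the walk toward 0)."""
    rng = random.Random(seed)
    pos, t = y, 0
    while pos != 0 and t < cap:
        pos += 1 if rng.random() < p else -1
        t += 1
    return t

# Illustrative parameters: p = 0.4 < 1/2, start at y = 5.
p, y, trials = 0.4, 5, 2000
est = sum(hitting_time(p, y, seed=s) for s in range(trials)) / trials
print(est, y / (1 - 2 * p))  # empirical mean of tau vs. y/(1-2p) = 25
```

The empirical mean should land close to $y/(1-2p)=25$; the sample standard error with 2000 trials is roughly half a step.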

For (b.ii), consider $Y_0=y\geq 1$. Since the walk moves in steps of $\pm 1$, for $Y_n$ to hit $0$ it must first hit $1$; that is, $H_0<\infty$ requires $H_1<\infty$ followed by the shifted walk $(Y_{H_1+n})_{n\geq 0}$ hitting $0$. By the strong Markov property, on $\{H_1<\infty\}$ the process $(Y_{H_1+n})_{n\geq 0}$ is a copy of the walk started from position $1$, independent of $\mathscr{F}_{H_1}$. Combined with translation invariance ($P_{y}(H_1<\infty)=P_{y-1}(H_0<\infty)$), this gives $$P_y(H_0<\infty)=P_{y}(H_1<\infty)P_1(H_0<\infty)=P_{y-1}(H_0<\infty)P_1(H_0<\infty),$$ i.e. $f(y)=f(y-1)f(1)$, and induction yields $f(y)=f(1)^{y}$.
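The product relation $f(y)=f(1)^y$ can also be checked empirically. A hedged sketch, with my own illustrative choices ($p=0.7$, an escape threshold of 60, and 10,000 trials per start): for $p>1/2$ the walk drifts to $+\infty$, so once it climbs far above $0$ the chance of ever returning is negligible, which lets us approximate $\{H_0<\infty\}$ by "hits $0$ before reaching the threshold".

```python
import random

def hits_zero(p, y, seed, escape=60):
    """Approximate the event {H_0 < infinity} for a walk started at y >= 1.
    For p > 1/2, the probability of returning to 0 from `escape` is about
    ((1-p)/p)**escape, which is negligible, so we stop there."""
    rng = random.Random(seed)
    pos = y
    while 0 < pos < escape:
        pos += 1 if rng.random() < p else -1
    return pos == 0

p, trials = 0.7, 10_000
f = {y: sum(hits_zero(p, y, seed=100_000 * y + s) for s in range(trials)) / trials
     for y in (1, 2, 3)}
print(f[1], f[2], f[3])       # empirical f(1), f(2), f(3)
print(f[1] ** 2, f[1] ** 3)   # should be close to f(2), f(3)
```

The estimates should also agree with the standard gambler's-ruin value $f(1)=(1-p)/p=3/7\approx 0.43$ for $p=0.7$.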