Verification of the hypotheses and application of Doob's optional stopping theorem to the 2D simple symmetric random walk


In the following problem there are a few things about the application of Doob's optional stopping theorem (DOST) that I am not sure about. It would be great if you could clear this up. Below is the statement of the theorem that I am using.

[Image: statement of the optional stopping theorem]

Observation: note that the conditions of DOST only need to be checked for the stopped process.

Let $M_n=D_n^2-n$, where $D_n=\|S_n\|$ is the distance of the walk from the origin; I have already proven that it is a martingale. My question is about (b). I know I have to apply DOST to it, so this is what I did. To apply the theorem (case iii), I have to verify that $M_n$ has bounded increments and that $E[T]<\infty$ (here $T=\nu_r$). Following the hint, I do it for the stopped process $M_n^{\nu_r}:=M_{n\land \nu_r}$, provided that $\nu_r$ is a stopping time:

$M_n^{\nu_r}$ has bounded increments:

$|M_{n+1}^{\nu_r}-M_n^{\nu_r}|=\left| (D_{n+1}^{\nu_r})^2-(n+1)-\left[(D_n^{\nu_r})^2-n\right] \right|=\left|(D_{n+1}^{\nu_r})^2-(D_n^{\nu_r})^2-1\right|\le \left|(D_{n+1}^{\nu_r})^2-(D_n^{\nu_r})^2\right| +1 $

(Strictly, this computation applies on $\{\nu_r > n\}$, where the stopped process agrees with the original one; on $\{\nu_r \le n\}$ the increment of the stopped process is $0$, so any bound holds trivially.)

On the other hand $D_{n+1}^2=|| S_{n+1}||^2=|| S_n+X_{n+1}||^2=||S_n||^2+||X_{n+1}||^2+2S_n\cdot X_{n+1} =D_n^2 + 1 +2S_n\cdot X_{n+1}\le D_n^2 + 1 +2||S_n|| ||X_{n+1}||=D_n^2 + 1+2D_n$

so at $n=\nu_r: $

$$D_{\nu_r+1}^2-D_{\nu_r}^2\le 1+2 D_{\nu_r}\le 1+2(r+1)=2r+3 $$

where in the last step I have used that

$$r+1 \ge D_{\nu_r} > r \tag {*}$$

because the walk moves in unit steps, so by the triangle inequality $D_n$ changes by at most $1$ per step, and at the stopping time the walk cannot be more than one unit beyond the threshold distance.
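As a sanity check (not part of the proof): the expansion above gives $E[D_{n+1}^2\mid\mathcal F_n]=D_n^2+1$, since the cross term $S_n\cdot X_{n+1}$ has zero conditional mean, so for the walk started at the origin $E[D_n^2]=n$. This is easy to test by simulation; a minimal Python sketch (the function name is mine):

```python
import random

def d_squared_after(n_steps, rng):
    """Squared Euclidean distance from the origin after n_steps of a 2D SSRW."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    x = y = 0
    for _ in range(n_steps):
        dx, dy = rng.choice(steps)
        x += dx
        y += dy
    return x * x + y * y

rng = random.Random(0)
n, trials = 10, 20000
est = sum(d_squared_after(n, rng) for _ in range(trials)) / trials
print(est)  # should be close to n = 10, up to sampling error
```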

$E[\nu_r]<\infty$: no idea how to show this.

Using DOST (case iii), I get that

$$E[D_{\nu_r}^2-\nu_r]=E[M_{\nu_r}]=E[M_0]=E[D_0^2]$$

, but I don't know how to compute this; I know it has to be $0$ so that I can have $E[D_{\nu_r}^2]=E[\nu_r]$,

which I use in (*):

$$r+1 \ge D_{\nu_r} > r$$ $$(r+1)^2 \ge D_{\nu_r}^2 > r^2$$ $$(r+1)^2 \ge E[D_{\nu_r}^2]=E[\nu_r] > r^2$$ $$(r+1)^2/r^2 \ge r^{-2}E[\nu_r] > 1$$ $$\lim_{r\to \infty}r^{-2}E[\nu_r] =1$$
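The conclusion $\lim_{r\to \infty}r^{-2}E[\nu_r]=1$ can also be checked numerically. A rough Monte Carlo sketch in Python (the function name is mine, and the estimate carries sampling error):

```python
import random

def exit_time(r, rng):
    """First n with D_n > r for a 2D SSRW started at the origin."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    x = y = n = 0
    while x * x + y * y <= r * r:
        dx, dy = rng.choice(steps)
        x += dx
        y += dy
        n += 1
    return n

rng = random.Random(1)
r, trials = 20, 2000
est = sum(exit_time(r, rng) for _ in range(trials)) / trials
print(est / r**2)  # should lie between 1 and (1 + 1/r)^2, up to sampling error
```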

My questions are

  1. Why is the observation true? What difference does it make to verify the hypotheses for the whole process rather than for the stopped process? The theorem says I have to verify them for the whole process, so I wonder where this remark comes from.

  2. How do I check that $\nu_r$ is a stopping time? According to my definition, I have to prove that $\{\nu_r \le n\} \in \mathcal F_n$, where $\mathcal F_n$ is the natural filtration of the walk: $\mathcal F_n=\sigma(S_0,S_1,S_2,\dots,S_n)$. And how do I prove that $E[\nu_r]=E[\inf\{n: D_n >r \}]<\infty$? I have no idea how to compute the expectation of something involving an $\inf$.

  3. In the application of the theorem: why would $E[D_0^2]=0$ be true?
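For what it is worth, the stopping-time half of question 2 reduces to expressing $\{\nu_r \le n\}$ through the path up to time $n$; here is a sketch in the notation above:

$$\{\nu_r \le n\}=\bigcup_{k=0}^{n}\{D_k > r\}=\bigcup_{k=0}^{n}\{\|S_k\| > r\},$$

and each event $\{\|S_k\| > r\}$ lies in $\sigma(S_k) \subset \mathcal F_n$ for $k \le n$, so the finite union lies in $\mathcal F_n$.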


The theorem I am using:

[Images: statement of the optional stopping theorem]

Answer:

This is only addressing the claim that $\mathbb{E}[\nu_r]$ is finite. We first use the following identity, valid for nonnegative integer-valued random variables: $$\mathbb{E}_0[\nu_r] = \sum_{n=1}^\infty \mathbb{P}_0(\nu_r \geq n).$$ Here the subscript $0$ means the random walk starts at the origin. Our goal is to bound the probabilities on the right-hand side.
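The tail-sum identity itself can be sanity-checked on a toy example, say $N$ uniform on $\{1,\dots,6\}$ (a fair die), where $\mathbb{E}[N]=7/2$. A small illustrative snippet:

```python
from fractions import Fraction

# N uniform on {1,...,6}: E[N] = 7/2
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

mean = sum(k * p for k, p in pmf.items())
# Tail sum: sum over n >= 1 of P(N >= n)
tail_sum = sum(sum(p for k, p in pmf.items() if k >= n) for n in range(1, 7))

print(mean, tail_sum)  # both equal 7/2
```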

Pick an integer $M \ge 2r+2$, say $M = 2r+2$ when $r$ is an integer. Consider a walk started at any $x_0 \in \mathbb{Z}^2$ with $\|x_0\| \le r$, where $\nu_r$ is the first time the walk is at Euclidean distance greater than $r$ from the origin, and let $A$ be the event that the first $M$ steps all go to the right, i.e., $S_0 = x_0,\ S_1 = x_0 + (1,0),\ \dots,\ S_M = x_0 + (M,0)$. Since each step goes right with probability $1/4$, $\mathbb{P}_{x_0}(A) = 4^{-M} > 0$, so in particular the probability of $A^C$ (the event that $A$ does not occur) is strictly less than $1$.

It follows that $$\mathbb{P}_{x_0}(\nu_r \geq M) \leq \mathbb{P}_{x_0}(A^C),$$ because on $A$ the walk exits the disk of radius $r$ within the first $M$ steps, i.e., $A \subset \{\nu_r < M\}$, and hence $\{\nu_r \geq M\} \subset A^C$.

Relabel the sum in blocks of $M$ consecutive terms and bound each term by the one at the start of its block: $$\mathbb{E}_0[\nu_r] = \sum_{n=0}^\infty \sum_{m=1}^M \mathbb{P}_0(\nu_r \geq nM+m) \leq M\sum_{n=0}^\infty \mathbb{P}_0(\nu_r \geq nM).$$

We are basically done: if $\nu_r \geq nM$, then the event $A$ failed to occur in each of the first $n$ blocks of $M$ steps. (We are sweeping some conditioning under the rug: one conditions on the values of $S_M,S_{2M},S_{3M},\dots,S_{(n-1)M}$ and uses that the increments of the Markov chain are independent and that the probability of $A$ does not depend on the starting point of the chain.) This leads to the bound $$\mathbb{P}(\nu_r \geq nM) \leq \mathbb{P}(A^C)^n.$$

Now sum the geometric series: since $\mathbb{P}(A^C) < 1$, it follows that $\mathbb{E}[\nu_r]$ is finite.

To elaborate on the conditioning (you should work this out for yourself), let $B_r$ be the set of points of $\mathbb{Z}^2$ at distance at most $r$ from the origin. Then note that $$\mathbb{P}_0(\nu_r \geq nM) = \sum_{x \in B_r}\mathbb{P}_0(\nu_r \geq nM\mid S_{(n-1)M} = x)\,\mathbb{P}_0(S_{(n-1)M} = x,\ \nu_r \geq (n-1)M).$$ The left factor is controlled using our event and some relabelling of time (so that the block starts at $0$), together with translation invariance of the step distribution: $$\mathbb{P}_0(\nu_r \geq nM\mid S_{(n-1)M} = x) = \mathbb{P}_x(\nu_r \geq M) \leq \mathbb{P}_x(A^C) = \mathbb{P}_0(A^C).$$ Summing over $x \in B_r$, the possible values of $S_{(n-1)M}$ on $\{\nu_r \geq (n-1)M\}$, we end up with $$\mathbb{P}_0(\nu_r \geq nM) \leq \mathbb{P}_0(A^C)\,\mathbb{P}_0(\nu_r \geq (n-1)M).$$
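For concreteness, with block length $M$ and $\mathbb{P}(A^C)=1-4^{-M}$, the geometric series gives the explicit bound $\mathbb{E}_0[\nu_r] \le M/(1-\mathbb{P}(A^C)) = M\cdot 4^M$. This is astronomically far from the true order $r^2$, but finiteness is all the optional stopping argument needs. A quick numerical comparison (a sketch; the choices $r=3$, $M=8$ and the function name are mine):

```python
import random

r = 3
M = 8  # block length used in the argument (my choice, comfortably larger than 2r)

# Crude bound from the geometric series: M / (1 - P(A^C)) = M * 4**M,
# since P(A) = 4**(-M) (M forced steps, each with probability 1/4).
bound = M * 4 ** M

def exit_time(r, rng):
    """First n with D_n > r for a 2D SSRW started at the origin."""
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    x = y = n = 0
    while x * x + y * y <= r * r:
        dx, dy = rng.choice(steps)
        x += dx
        y += dy
        n += 1
    return n

rng = random.Random(2)
trials = 3000
est = sum(exit_time(r, rng) for _ in range(trials)) / trials
print(bound, est)  # the crude bound (524288) dwarfs the Monte Carlo mean
```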