Simple random walk, Martingales, stopping time

67 Views Asked by At

Suppose $S_n = X_1 + \dots + X_n $ is a simple random walk starting at 0. For any K, let $$T = \min \{n: | S_n| =K\}. $$ $\bullet$ Explain why for every j, $$ \mathbb{P}\{T \leq j +K | T > j\} \geq 2^{-K}$$ ANSWER: The inequality $ \mathbb{P} \{ T \leq j + K | T > j \} \geq 2^{-K}$ is related to properties of simple random walk. In a simple random walk, we start at 0 and at each step, we move +1 or -1 with equal probability 1/2

This means that for any given step, there’s a 1/2 chance of moving in either direction. Now, let’s consider the event $\{T \leq j + K | T > j\}$. This event is saying that if we haven’t hit the boundary $\pm K$ by time j, the probability that we will hit the boundary by time j + K is at least $2^{-K}$

This can be intuitively understood as follows: If we haven’t hit the boundary by time j, we are somewhere in the interval (-K, K). To hit the boundary by time j + K, we need to take K steps in one direction. Since each step is independent and has a 1/2 chance of going in either direction, the probability of taking K steps in one direction is $(\frac12)^K = 2^{-K}$ Is this answer correct?

$\bullet$ Show that there exist $ c < \infty, \alpha > 0 $ such that $$\forall j, \mathbb{P}\{T > j\} \leq ce^{-\alpha j}$$

Conclude that $\mathbb{E}[T^r] < \infty$ for every r > 0.

ANSWER:

To prove the inequality $\mathbb{P}\{T > j\} \leq ce^{-\alpha \cdot j}$, we can use a technique called the reflection principle. The reflection principle states that for a simple random walk, the event $\{T > j\}$ is equivalent to the event that the random walk first reaches the level $K$ after $j$ steps, and then returns to the origin before hitting the level $-K$.

Let's define $A_j$ as the event that the random walk starting at 0 first reaches the level $K$ after $j$ steps. Similarly, let $B_j$ be the event that the random walk starting at 0 returns to the origin before hitting the level $-K$ after reaching the level $K$ at some point.

Using the reflection principle, we have $\mathbb{P}\{T > j\} = \mathbb{P}\{A_j \cap B_j\}$. Since these events are independent, we can write:

$$\mathbb{P}\{T > j\} = \mathbb{P}\{A_j\} \cdot \mathbb{P}\{B_j\}.$$

Now, we can use the inequality given in the problem statement: $\mathbb{P}\{ T \leq j + K | T > j\} \geq 2^{-K}$. Rearranging this inequality, we get $\mathbb{P}\{ T > j\} \leq 1 - 2^{-K}$.

Using this inequality, we can bound $\mathbb{P}\{A_j\}$ as follows:

$$\mathbb{P}\{A_j\} = \mathbb{P}\{A_j \cap T > j\} + \mathbb{P}\{A_j \cap T \leq j\} \leq \mathbb{P}\{T > j\} + \mathbb{P}\{T \leq j\} \leq 1 - 2^{-K} + \mathbb{P}\{T \leq j\}.$$

Now, let's denote $p = \mathbb{P}\{T \leq j\}$. We can rewrite the inequality as $\mathbb{P}\{A_j\} \leq 1 - 2^{-K} + p$.

Next, we consider $\mathbb{P}\{B_j\}$. Since $B_j$ is the event that the random walk returns to the origin before hitting $-K$ after reaching $K$, we can use the reflection principle to relate it to the probability of hitting $-K$ before hitting $K$ starting from 0. Denoting this probability as $q$, we have $\mathbb{P}\{B_j\} = q$.

Combining the inequalities for $\mathbb{P}\{A_j\}$ and $\mathbb{P}\{B_j\}$, we get:

$$\mathbb{P}\{T > j\} = \mathbb{P}\{A_j\} \cdot \mathbb{P}\{B_j\} \leq (1 - 2^{-K} + p) \cdot q.$$

Now, we need to find an upper bound for $q$. Since $q$ is the probability of hitting $-K$ before hitting $K$ starting from 0, it can be shown that $q \leq \frac{1}{2^K}$. Therefore, we have:

$$\mathbb{P}\{T > j\} \leq (1 - 2^{-K} + p) \cdot \frac{1}{2^K}.$$

Now, we want to find a constant $c$ such that $(1 - 2^{-K} + p) \cdot \frac{1}{K} \leq ce^{-\alpha \cdot j}$. We can choose $c = \frac{1}{2^K}$ and $\alpha = -\ln(1 - 2^{-K} + p)$. With these choices, the inequality becomes:

$$\mathbb{P}\{T > j\} \leq \frac{1}{2^K} \cdot e^{-\alpha \cdot j}.$$

Therefore, we have shown that $\mathbb{P}\{T > j\} \leq ce^{-\alpha \cdot j}$ for some finite constant $c$ and $\alpha > 0$.

Finally, to conclude that $\mathbb{E}[T^r] < \infty$ for every $r > 0$, we can use the moment generating function (MGF) of $T$. The MGF is defined as $\mathbb{E}[e^{tT}]$. Using the inequality we derived earlier,To prove that $\mathbb{E}[T^r] < \infty$ for every $r > 0$, we can use the moment generating function (MGF) of $T$. The MGF is defined as $\mathbb{E}[e^{tT}]$.

From the previous inequality $\mathbb{P}\{T > j\} \leq ce^{-\alpha \cdot j}$, we have $\mathbb{P}\{T \leq j\} \geq 1 - ce^{-\alpha \cdot j}$. Therefore, we can bound the MGF as follows:

$$\mathbb{E}[e^{tT}] = \sum_{j=0}^{\infty} e^{tj}\mathbb{P}\{T = j\} = \sum_{j=0}^{\infty} e^{tj}\left(\mathbb{P}\{T \leq j\} - \mathbb{P}\{T \leq j-1\}\right).$$

Using the lower bound on $\mathbb{P}\{T \leq j\}$, we can write:

$$\mathbb{E}[e^{tT}] \geq \sum_{j=0}^{\infty} e^{tj}\left(\left(1 - ce^{-\alpha \cdot j}\right) - \left(1 - ce^{-\alpha \cdot (j-1)}\right)\right).$$

Simplifying the expression, we obtain:

$$\mathbb{E}[e^{tT}] \geq c\sum_{j=0}^{\infty} \left( e^{tj + \alpha}- e^{-tj}\right)e^{-\alpha \cdot j} = c\sum_{j=0}^{\infty} e^{tj}(e^{\alpha}-1)e^{-\alpha \cdot j}.$$

The sum on the right-hand side is a geometric series, and it converges to a finite value as long as $|e^{t-\alpha}| < 1$. This condition holds for sufficiently small values of $t$ and $\alpha$. Therefore, we have:

$$\mathbb{E}[e^{tT}] \geq c\sum_{j=0}^{\infty} e^{tj}( e^{\alpha}-1)e^{-\alpha \cdot j} < \infty.$$

Since the MGF is finite in a neighborhood of zero, it implies that all moments of $T$ exist. In particular, we have $\mathbb{E}[T^r] < \infty$ for every $r > 0$.

Is this answer correct?

$\bullet$ Let $M_n = S^2_n - n $ Show there exist $ C < \infty$ such that $$\forall n, \mathbb{E}[M^2_{n\wedge T}] \leq C $$
ANSWER:

To show that there exists a finite constant $C$ such that $\forall n, \mathbb{E}\{ M^2_{n\wedge T}\} \leq C$, we can utilize the concept of stopping times and apply the optional stopping theorem.

Let's define a stopping time $T$ as the first time at which the sequence $(S_n)$ hits a certain boundary. In other words, $T = \inf\{n : S_n = a \text{ or } S_n = b\}$, where $a$ and $b$ are fixed boundaries. Note that $T$ is a stopping time because it depends only on the values of the sequence $(S_n)$ up to time $n$.

Now, we want to analyze the random variable $M_{n\wedge T}$, which denotes the process $(M_n)$ stopped at time $T$ or at time $n$ if $T > n$. In other words, $M_{n\wedge T} = S_{n\wedge T}^2 - (n\wedge T)$.

Applying the optional stopping theorem, we have:

$$\mathbb{E}\{M_{n\wedge T}\} = \mathbb{E}\{S_{n\wedge T}^2 - (n\wedge T)\} = \mathbb{E}\{S_{n\wedge T}^2\} - \mathbb{E}\{(n\wedge T)\}.$$

Now, note that $M_{n\wedge T}$ is a bounded random variable because $M_{n\wedge T} \leq \max\{a^2, b^2\}$. Therefore, by the optional stopping theorem, we can conclude that:

$$\mathbb{E}\{M_{n\wedge T}\} = \mathbb{E}\{S_{n\wedge T}^2\} - \mathbb{E}\{(n\wedge T)\} = \mathbb{E}\{M_{0}\} = \mathbb{E}\{S_0^2\} - \mathbb{E}\{0\} = \mathbb{E}\{S_0^2\}.$$

In other words, the expected value of $M_{n\wedge T}$ is constant and does not depend on $n$ or $T$. Let's denote this constant by $C = \mathbb{E}\{S_0^2\}$. Therefore, we have:

$$\forall n, \mathbb{E}\{ M^2_{n\wedge T}\} = \mathbb{E}\{ S^4_{n\wedge T}\} \leq C,$$

where $C$ is a finite constant. Hence, we have shown that there exists a finite constant $C$ such that $\forall n, \mathbb{E}\{ M^2_{n\wedge T}\} \leq C$.