Limit involving a symmetric random walk


Let $(S_n)_{n\ge0}$ be a symmetric, simple random walk. Given $a \in \mathbb{R}$, find the following $$ \lim_{n\to \infty}\mathbb{P}\left(n^{-3/2}\sum_{i=1}^n S_i>a\right)$$

My method: Let $S_0 = 0$ and $S_n = \epsilon_1 + \ldots + \epsilon_n$, where the $\epsilon_i$ are i.i.d. random variables with $\mathbb{P}(\epsilon_i = 1) = \mathbb{P}(\epsilon_i = -1) = 1/2$. I then wish to find $p_n = \mathbb{P}(S_i \ge 0 \text{ for all } 1 \le i \le n)$, by looking at the stopping time:

$$\tau = \inf\{k \ge 1 : S_k = -1\}.$$ Noting that $$p_n = \mathbb{P}(S_n \ge 0, \tau > n) = \mathbb{P}(\{S_n \ge 0\} \setminus \{S_n \ge 0, \tau < n\}) = \mathbb{P}(S_n \ge 0) - \mathbb{P}(S_n \ge 0, \tau < n)$$

I've tried a few things from here but get stuck; any advice on whether I have tackled this the right way? To deal with the $\mathbb{P}(S_n \ge 0, \tau < n)$ term I was thinking of introducing a new random walk which is reflected at time $\tau$ with respect to the level $-1$.


Best answer:

As much as Mike's answer was short and simple, I wanted to see if I could get to the same end result using my approach, and I did, as below.

If we use another symmetric random walk, say $(\hat{S}_j)_{j\ge0}$, which is $(S_j)$ reflected at time $\tau$ with respect to the level $-1$, that is

$$\hat{S}_j= \begin{cases} S_j&\text{if } j\le \tau,\\ -2-S_j&\text{otherwise,} \end{cases}$$

then: if $\tau <n$, $S_n \ge 0$ is equivalent to $\hat{S}_n \le -2$, so $\mathbb{P}(S_n \ge 0, \tau < n) = \mathbb{P}(\hat{S}_n \le -2, \tau < n)$; but $\{\hat{S}_n \le -2\} \subset \{\tau < n\}$, therefore we get $$p_n = \mathbb{P}(S_n \ge 0) - \mathbb{P}(\hat{S}_n \le -2),$$ which by symmetry (the reflected walk is again a symmetric simple random walk, so $\mathbb{P}(\hat{S}_n \le -2) = \mathbb{P}(S_n \ge 2)$) and the reflection principle becomes

$$p_n = \mathbb{P}(S_n \in \{0, 1\}) = \begin{cases} \binom{n}{n/2}2^{-n} &\text{if $n$ is even,} \\ \binom{n}{(n-1)/2}2^{-n}&\text{if $n$ is odd.} \end{cases}$$
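As a sanity check on this closed form (my own addition, not part of the argument), one can enumerate all $2^n$ sign sequences for small $n$ and compare with the formula; the helper names below are hypothetical:

```python
from itertools import product
from math import comb

def p_n_bruteforce(n):
    """P(S_1, ..., S_n >= 0) by enumerating all 2^n sign sequences."""
    good = 0
    for eps in product((1, -1), repeat=n):
        s = 0
        for e in eps:
            s += e
            if s < 0:
                break
        else:  # no partial sum went negative
            good += 1
    return good / 2**n

def p_n_formula(n):
    # C(n, n/2) 2^{-n} for even n and C(n, (n-1)/2) 2^{-n} for odd n;
    # both cases equal comb(n, n // 2) / 2**n.
    return comb(n, n // 2) / 2**n

for n in range(1, 13):
    assert abs(p_n_bruteforce(n) - p_n_formula(n)) < 1e-12
```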

Then, by Stirling's formula we easily see that $cn^{-1/2} \le p_n \le dn^{-1/2}$ for some positive constants $c$ and $d$ (in fact $p_n \sim \sqrt{2/(\pi n)}$).

We notice that $$ \sum_{j=1}^n S_j = n\epsilon_1+(n-1)\epsilon_2+\ldots+\epsilon_n,$$

which of course has the same distribution as $\sum_{j=1}^n j\epsilon_j$; a sum of independent random variables. The variance is $$\sigma^2_n=\operatorname{Var}\left(\sum_{j=1}^n j\epsilon_j\right)= \sum_{j=1}^n j^2 = \frac{n(n+1)(2n+1)}{6}.$$ Now let $X_j = j\epsilon_j/\sigma_n$. The $X_j$ satisfy Lindeberg's condition: $\max_{j\le n}|X_j| \le n/\sigma_n = O(n^{-1/2})$, so for any fixed $\varepsilon>0$ the truncated moments $\mathbb{E}[X_j^2\mathbf{1}_{\{|X_j|>\varepsilon\}}]$ all vanish for $n$ large enough.
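A quick programmatic check of the variance formula (again my own addition), including the normalization $\sigma_n^2/n^3 \to 1/3$ used in the next step:

```python
def sigma2(n):
    # sigma_n^2 = sum_{j=1}^n j^2 = n(n+1)(2n+1)/6
    return n * (n + 1) * (2 * n + 1) // 6

# closed form matches the direct sum
for n in range(1, 200):
    assert sigma2(n) == sum(j * j for j in range(1, n + 1))

# sigma_n^2 / n^3 -> 1/3, hence a * n^{3/2} / sigma_n -> sqrt(3) * a
assert abs(sigma2(10**6) / 10**18 - 1 / 3) < 1e-5
```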

Therefore, by the CLT we get that

$$ \lim_{n\to\infty} \mathbb{P} \left(n^{-3/2}\sum_{j=1}^n S_j >a \right) = \lim_{n\to\infty} \mathbb{P} \left(\sum_{j=1}^n X_j >an^{3/2}/\sigma_n \right) = \mathbb{P}(Y > \sqrt{3}a) = 1-\Phi(\sqrt{3}a), $$ where $Y$ is a standard Gaussian random variable, since $\sigma_n^2/n^3 \to 1/3$ and hence $an^{3/2}/\sigma_n \to \sqrt{3}a$.
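The limit can also be checked numerically; the following Monte Carlo sketch (parameters are my own choices, not from the answer) uses the identity $\sum_{i=1}^n S_i = \sum_{j=1}^n (n-j+1)\epsilon_j$ from above to avoid building full paths:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
n, trials, a = 1000, 50_000, 0.5

# sum_{i=1}^n S_i = sum_{j=1}^n (n - j + 1) * eps_j
eps = rng.integers(0, 2, size=(trials, n), dtype=np.int8) * 2 - 1
weights = np.arange(n, 0, -1, dtype=np.float64)
stat = (eps @ weights) / n**1.5          # n^{-3/2} * sum_i S_i per trial

emp = (stat > a).mean()                  # empirical tail probability
limit = 1 - 0.5 * (1 + erf(sqrt(3) * a / sqrt(2)))  # 1 - Phi(sqrt(3) a)
# emp and limit should agree to within Monte Carlo error
```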

Mike's answer:

As $n\to\infty$, the process $X_t={n^{-1/2}}S_{\lfloor nt\rfloor}$ converges in distribution (by Donsker's invariance principle) to Brownian motion $(W_t)$ on $[0,1]$. Your sum, $n^{-3/2}\sum_{i=1}^n S_i=\sum_{i=1}^n \frac1n\cdot\frac1{\sqrt n}S_i$, is a Riemann sum for $\int_0^1 X_t\,dt$, so it converges in distribution to $\int_0^1 W_t\,dt$. The integral $\int_0^1 W_t\,dt$ is a normal random variable with mean zero and variance $1/3$. Therefore, $$\mathbb P\left({n^{-3/2}}\sum_{i=1}^n S_{i}>a\right)\to \mathbb P(N(0,1/3)>a)=\mathbb P(N(0,1)>\sqrt{3}a)=1-\Phi(\sqrt{3}a).$$
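For completeness, the variance $1/3$ follows from Fubini and the Brownian covariance $\mathbb{E}[W_sW_t]=\min(s,t)$:

$$\operatorname{Var}\left(\int_0^1 W_t\,dt\right)=\int_0^1\!\int_0^1 \mathbb{E}[W_sW_t]\,ds\,dt=\int_0^1\!\int_0^1 \min(s,t)\,ds\,dt=2\int_0^1\!\int_0^t s\,ds\,dt=\int_0^1 t^2\,dt=\frac13.$$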