How to combine the four Theorems in order to prove the statement?


I have a question concerning a statement about Random Walks on $\mathbb{Z}$. Let $F$ be a distribution on $\mathbb{Z}$ which has mean $0$ and finite variance. Let $\left\{X_1,X_2,\ldots\right\}$ be an i.i.d. sequence of random variables with distribution $F$, and let $S_n:=X_1+\cdots+X_n$ denote the associated partial sums.

Now there is the following statement:

$$ \text{Prob}\left\{S_1>0,S_2>0,\ldots,S_n>0\right\}\sim\sigma((2\pi n)^{1/2} E(S_N))^{-1}, $$ where $\sigma$ is the standard deviation of the distribution $F$, $N$ is the stopping time corresponding to the first entrance into $\left\{1,2,\ldots\right\}$ and $E$ denotes expected value.
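To get a feeling for the statement, here is a small exact computation (my own illustration; the function name and the specialization are mine): for the simple symmetric walk with steps $\pm 1$, each with probability $1/2$, we have $\sigma = 1$ and $S_N = 1$ almost surely, so the claim reduces to $p_n \sim (2\pi n)^{-1/2}$.

```python
from math import sqrt, pi

# Sanity check of the claimed asymptotic for the simple symmetric walk
# (steps +-1 with probability 1/2 each): here sigma = 1 and S_N = 1
# almost surely, so the claim reduces to p_n ~ (2*pi*n)^(-1/2).
def stay_positive_prob(n):
    """p_n = P(S_1 > 0, ..., S_n > 0), by exact DP over positive states."""
    dist = {1: 0.5}              # after one step: S_1 = 1 with prob 1/2
    for _ in range(n - 1):
        new = {}
        for pos, pr in dist.items():
            for step in (-1, 1):
                if pos + step > 0:   # the walk must stay strictly positive
                    new[pos + step] = new.get(pos + step, 0.0) + 0.5 * pr
        dist = new
    return sum(dist.values())

for n in (100, 400, 1600):
    print(n, stay_positive_prob(n), 1 / sqrt(2 * pi * n))
```

The two printed columns agree to within a fraction of a percent already for moderate $n$, consistent with the statement.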

For the proof, it is said that the following four Theorems have to be combined. I have tried a lot. I think one can start with (7.14) on page 415 (and maybe prove that statement afterwards), which says (see notation below) that $$ \frac{1-\tau(s)}{1-s}\sim \frac{e^{-c}}{\sqrt{1-s}}, \qquad s\to 1-. $$ The next step is probably to apply Theorems 2, 3 and 4, but I am not sure how to do that exactly. Maybe you can help me.

--

Some notation before I give the Theorems.

Let $\mathfrak{T}_1$ be the epoch of the first entry into the (strictly) positive half-axis defined by $$ \left\{\mathfrak{T}_1=n\right\}=\left\{S_1\leq 0,\ldots,S_{n-1}\leq 0, S_n >0\right\}. $$ Let $\tau_n:=\mathbb{P}(\left\{S_1\leq 0,\ldots,S_{n-1}\leq 0, S_n >0\right\})$ and let $\tau(s)=\sum_{n=1}^{\infty}\tau_n s^n,~~0\leq s\leq 1$ be the generating function of $\mathfrak{T}_1$.

A positive function $L$ is said to vary slowly at infinity if for every fixed $x>0$, $\frac{L(tx)}{L(t)}\to 1$ as $t\to\infty$.


Theorem 1 $$ \mathbb{P}\left\{\mathfrak{T}_1 > n\right\}\sim\frac{1}{\sqrt{\pi}}e^{-c}\frac{1}{\sqrt{n}},~~~c:=\sum_{n=1}^{\infty}\frac{1}{n} [\mathbb{P}\left\{S_n > 0\right\}-\frac{1}{2}]. $$
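Theorem 1 can be checked numerically on the simple symmetric $\pm 1$ walk. For this walk the series defining $c$ can be summed in closed form (a side computation of mine, not part of the theorem): $c = -\tfrac{1}{2}\log 2$, so $e^{-c} = \sqrt{2}$.

```python
from math import sqrt, pi

# Checking Theorem 1 on the simple symmetric +-1 walk. The closed form
# e^{-c} = sqrt(2) used below is specific to this walk.
def first_entry_tail(n):
    """P(T_1 > n) = P(S_1 <= 0, ..., S_n <= 0), by exact DP over states <= 0."""
    dist = {0: 1.0}
    for _ in range(n):
        new = {}
        for pos, pr in dist.items():
            for step in (-1, 1):
                if pos + step <= 0:      # walk must stay non-positive
                    new[pos + step] = new.get(pos + step, 0.0) + 0.5 * pr
        dist = new
    return sum(dist.values())

for n in (100, 400):
    print(n, first_entry_tail(n), sqrt(2) / sqrt(pi * n))
```

The exact tail probability and the asymptotic $\frac{1}{\sqrt{\pi}}e^{-c}\frac{1}{\sqrt{n}}$ agree closely for these $n$.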


Theorem 2

The generating function of the probabilities $$ p_n=\mathbb{P}\left\{S_1 >0, S_2 >0,\ldots,S_n >0\right\} $$ is given by $$ p(s)=\frac{1}{1-\tau(s)}. $$ Equivalently, $$ \log p(s)=\sum_{n=1}^{\infty}\frac{s^n}{n}\mathbb{P}\left\{S_n>0\right\}. $$
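The identity $p(s)=\frac{1}{1-\tau(s)}$ can be tested at the level of coefficients: it is equivalent to $p_n = \sum_{k=1}^{n}\tau_k\,p_{n-k}$ with $p_0 = 1$. A sketch for the $\pm 1$ walk (helper name and setup are mine):

```python
# Coefficient-level check of Theorem 2 for the simple symmetric +-1 walk:
# p(s)(1 - tau(s)) = 1 means p_n = sum_{k=1}^n tau_k * p_{n-k}, p_0 = 1.
def restricted_mass(n, keep):
    """Total probability of n-step +-1 paths whose partial sums all satisfy keep."""
    dist = {0: 1.0}
    for _ in range(n):
        new = {}
        for pos, pr in dist.items():
            for step in (-1, 1):
                if keep(pos + step):
                    new[pos + step] = new.get(pos + step, 0.0) + 0.5 * pr
        dist = new
    return sum(dist.values())

N = 30
tail = [restricted_mass(n, lambda k: k <= 0) for n in range(N + 1)]  # P(T_1 > n)
tau = [0.0] + [tail[n - 1] - tail[n] for n in range(1, N + 1)]       # P(T_1 = n)
p = [restricted_mass(n, lambda k: k > 0) for n in range(N + 1)]      # p_0 = 1

err = max(abs(p[n] - sum(tau[k] * p[n - k] for k in range(1, n + 1)))
          for n in range(1, N + 1))
print("max coefficient mismatch:", err)
```

The mismatch is zero up to floating-point rounding, as the theorem predicts.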


Theorem 3

Let $q_n\geq 0$ and suppose that $$ Q(s)=\sum_{n=0}^{\infty}q_n s^n $$ converges for $0\leq s <1$. If $L$ varies slowly at infinity and $0\leq p <\infty$, then each of the two relations $$ Q(s)\sim\frac{1}{(1-s)^p}L\left(\frac{1}{1-s}\right), s\to 1-~~~(*) $$ and $$ q_0+q_1+\ldots+q_{n-1}\sim\frac{1}{\Gamma(p+1)}n^p L(n), n\to\infty $$ implies the other. Furthermore, if the sequence $\left\{q_n\right\}$ is monotonic and $0<p<\infty$, then $(*)$ is equivalent to $$ q_n\sim\frac{1}{\Gamma(p)}n^{p-1}L(n), n\to\infty. $$
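The Tauberian direction of Theorem 3 can be illustrated with a toy sequence (my own choice, not from Feller): $q_n = (n+1)^{-1/2}$ is monotone and corresponds to $p = 1/2$ and $L(n) = \Gamma(1/2) = \sqrt{\pi}$, so $(*)$ predicts $Q(s) \sim \sqrt{\pi/(1-s)}$ as $s \to 1-$.

```python
from math import sqrt, pi

# Illustrating Theorem 3 with q_n = (n+1)^(-1/2): monotone, p = 1/2,
# L(n) = sqrt(pi), so the theorem predicts Q(s) ~ sqrt(pi/(1-s)), s -> 1-.
def Q(s, tol=1e-12):
    total, power, n = 0.0, 1.0, 0
    while power > tol:
        total += power / sqrt(n + 1)
        n += 1
        power *= s
    return total

for s in (0.99, 0.999, 0.9999):
    print(s, Q(s), sqrt(pi / (1 - s)), Q(s) / sqrt(pi / (1 - s)))
```

The ratio in the last column tends to $1$ as $s \to 1-$, in line with $(*)$.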


Theorem 4

If $F$ has zero expectation and variance $\sigma^2$, then the series $$ \sum_{n=1}^{\infty}\frac{1}{n} [\mathbb{P}\left\{S_n>0\right\}-\frac{1}{2}]=c $$ converges at least conditionally, and $$ \mathbb{E}(S_N)=\frac{\sigma}{\sqrt{2}}e^{-c}. $$
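Theorem 4 can also be checked on the simple symmetric $\pm 1$ walk, where $\sigma = 1$ and $S_N = 1$ almost surely, so the theorem forces $e^{-c} = \sqrt{2}$, i.e. $c = -\tfrac{1}{2}\log 2$. Only even $n$ contribute to the series, since $\mathbb{P}\{S_{2k+1}>0\} = \tfrac{1}{2}$ exactly, while $\mathbb{P}\{S_{2k}>0\} = \tfrac{1}{2}(1-u_{2k})$ with $u_{2k}=\binom{2k}{k}4^{-k}$ (these even/odd formulas are my own side computation):

```python
from math import log

# Checking Theorem 4 on the +-1 walk: the theorem predicts c = -(log 2)/2.
# Only even n contribute: the 2k-th term is (1/(2k)) * (P{S_{2k}>0} - 1/2)
# = -u_{2k}/(4k), with u_{2k} = C(2k, k)/4^k computed iteratively.
c, u = 0.0, 1.0
for k in range(1, 1_000_000):
    u *= (2 * k - 1) / (2 * k)   # u_{2k} from u_{2k-2}
    c += -u / (4 * k)

print(c, -log(2) / 2)
```

The partial sum agrees with $-\tfrac{1}{2}\log 2 \approx -0.3466$ to about three decimal places; the remaining gap is the (slowly decaying) tail of the series.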


The Theorems are from Feller, An Introduction to Probability Theory and Its Applications.


Accepted answer:

Let $Q(s) = \frac{1-\tau(s)}{1-s}$. It is not hard to check that the coefficient $q_n$ of $s^n$ in $Q(s)$ is $$ q_n = 1-\sum_{m=0}^n \tau_m = \Pr[\mathfrak{T}_1 > n] \sim \frac{1}{\sqrt{\pi}} e^{-c} \frac{1}{\sqrt{n}}, $$ using Theorem 1. Clearly $q_n$ is monotonic. Since $\Gamma(1/2) = \sqrt{\pi}$, this is exactly $$ q_n \sim \frac{1}{\Gamma(p)} n^{p-1} L(n), \qquad p = \frac{1}{2}, \quad L(n) = e^{-c}. $$ Clearly $L(n)$ is slowly varying at infinity (it is constant), so Theorem 3 shows that as $s \to 1-$, $$ \frac{1-\tau(s)}{1-s} \sim \frac{e^{-c}}{\sqrt{1-s}}. $$ Theorem 2 implies that as $s \to 1-$, $$ p(s) = \frac{1}{1-s} \frac{1-s}{1-\tau(s)} \sim \frac{e^c}{\sqrt{1-s}}. $$ Applying Theorem 3 again to $p(s)$, with $p=1/2$ and $L(n) = e^c$ (clearly $p_n$ is monotonic), we deduce that $$ p_n \sim \frac{1}{\sqrt{\pi}} n^{-1/2} e^c. $$

Theorem 4 implies that $$ p_n \sim \frac{\sigma}{\sqrt{2\pi}} n^{-1/2} \mathbb{E}(S_N)^{-1} = \sigma ((2\pi n)^{1/2} \mathbb{E}(S_N))^{-1}, $$ which is your statement.