Asymmetric random walk on the set $\mathbb{Z}$


Suppose we have a random walk $S_n = \xi_1+\xi_2+\dots+\xi_n$ on the set $\mathbb{Z}$, starting from zero ($S_0 = 0$), where $\mathbb{P}(\xi_k = j) = p_j$ for $j = -1,0,1,2,\dots$

I only know that $p_j \geq 0$ for every $j$ and $\sum\limits_{j=-1}^{+\infty}p_j=1$.

I want to find $\mathbb{E}[t_1]$, where $t_1 = \min\{\, n : S_n = -1\}$; in other words, I need the expected number of steps before the random walk first reaches $-1$.

I tried to approach this task as follows. Denote $m = \mathbb{E}[t_1]$. From zero we can reach $-1$ in one step with probability $p_{-1}$, or go from $0$ to $1$ with probability $1-p_{-1}$, in which case we need an expected $m$ steps twice: once to return from $1$ to $0$ and once more to go from $0$ to $-1$.

As a result I get the following equation: $m = 1 \cdot p_{-1} + (1+2m)(1-p_{-1}) \Rightarrow m = \frac{1}{1-2(1-p_{-1})}$.
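One way to stress-test a candidate formula like this is a quick Monte Carlo simulation. The step distribution below ($p_{-1}=0.7$, $p_{2}=0.3$) is an assumed example, not part of the problem statement; it is chosen so that steps other than $\pm 1$ actually occur:

```python
import random

# Monte Carlo check of the candidate formula m = 1 / (1 - 2(1 - p_{-1}))
# under an assumed step distribution (hypothetical, not from the problem):
# P(xi = -1) = 0.7, P(xi = +2) = 0.3.
random.seed(0)

def hitting_time(p_minus1=0.7, max_steps=10**6):
    """Number of steps until S_n first equals -1, starting from S_0 = 0."""
    s, n = 0, 0
    while s != -1 and n < max_steps:
        # since the only negative step is -1, the walk cannot jump over -1
        s += -1 if random.random() < p_minus1 else 2
        n += 1
    return n

trials = 50_000
sim_mean = sum(hitting_time() for _ in range(trials)) / trials

p = 0.7
candidate = 1 / (1 - 2 * (1 - p))  # the formula derived above, = 2.5 here
print(f"simulated E[t_1] ~ {sim_mean:.2f}, candidate formula gives {candidate:.2f}")
```

If the two numbers disagree substantially for such a distribution, the derivation must be using an assumption that this distribution violates.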

But as far as I know, there is an error in this solution. Could somebody explain where my solution breaks down?

I don't want to see the right solution (I want to work it out myself), but I cannot find the error in my argument.

Accepted answer:

Here's a non-rigorous way of approaching this.

First notice that, starting at $0$, the time to hit $-1$ has the same distribution as the time to hit $0$ starting at $1$. Taking inspiration from this, define $t_{j-1}^{j}$ as the time to hit $j-1$ starting at $j$. Notice that $t_{j-1}^{j}$ has the same distribution as $t_{-1}^{0}$.

Now assume that $E(t_{-1}^{0})<\infty$.

Then, conditioning on the first step, we have the following informal identity:

$$t_{-1}^{0}=p_{-1}\cdot 1+p_{0}\cdot(1+t_{-1}^{0}) + p_{1}\cdot(1+t_{0}^{1}+t_{-1}^{0})+p_{2}\cdot(1+t_{1}^{2}+t_{0}^{1}+t_{-1}^{0})+\cdots$$

Taking expectations and using linearity (and Tonelli's theorem), with $m=E(t_{-1}^{0})$ you get

$$m=p_{-1}+p_{0}(1+m)+p_{1}(1+2m)+ p_{2}(1+3m)+\cdots=\\1+p_{0}m+2p_{1}m+ 3p_{2}m+\cdots$$

So

$$m=1+\sum_{k=0}^{\infty}p_{k}(k+1)m=1+m\sum_{k=0}^{\infty}kp_{k}+m\sum_{k=0}^{\infty}p_{k}$$

Now notice that $\sum_{k=0}^{\infty}kp_{k}=E(\xi)+p_{-1}$ and $\sum_{k=0}^{\infty}p_{k}=1-p_{-1}$.
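These two identities are easy to sanity-check numerically. The distribution below ($p_{-1}=0.5$, $p_0=0.2$, $p_1=0.2$, $p_2=0.1$) is a hypothetical example, not taken from the problem:

```python
# Numeric check of the identities
#   sum_{k>=0} k p_k = E(xi) + p_{-1}   and   sum_{k>=0} p_k = 1 - p_{-1}
# for an assumed example distribution (hypothetical).
p = {-1: 0.5, 0: 0.2, 1: 0.2, 2: 0.1}

e_xi = sum(j * pj for j, pj in p.items())     # E(xi) = -0.1 here
lhs1 = sum(k * p[k] for k in (0, 1, 2))       # sum over k >= 0 of k * p_k
lhs2 = sum(p[k] for k in (0, 1, 2))           # sum over k >= 0 of p_k

assert abs(lhs1 - (e_xi + p[-1])) < 1e-12
assert abs(lhs2 - (1 - p[-1])) < 1e-12
```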

Hence you have $m=1+m(E(\xi)+p_{-1})+m(1-p_{-1})$, which leads to $mE(\xi)=-1$, i.e. $m=\frac{-1}{E(\xi)}$. This is possible only if $E(\xi)<0$. If $E(\xi)>0$, we have a contradiction, since $t_{-1}^{0}$ is a positive random variable, and hence we must have $m=\infty$. (This should be intuitive; you can relate it to Gambler's ruin in the sense that the walk almost surely escapes to $+\infty$.) If $E(\xi)=0$, you also get a contradiction, as the equation would then read $1=0$, so again $m=\infty$.

Thus if $-\infty< E(\xi)<0$ then $m=\frac{-1}{E(\xi)}$; if $E(\xi)\geq 0$ then $m=\infty$.
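As a sketch of how to check this conclusion empirically, here is a simulation of $m=\frac{-1}{E(\xi)}$ for an assumed step distribution with negative drift ($p_{-1}=0.6$, $p_0=0.2$, $p_1=0.2$, so $E(\xi)=-0.4$; this distribution is a made-up example):

```python
import random

# Simulate E[t_1] and compare with -1/E(xi) for an assumed distribution:
# P(xi = -1) = 0.6, P(xi = 0) = 0.2, P(xi = +1) = 0.2  =>  E(xi) = -0.4.
random.seed(1)

def sample_step():
    u = random.random()
    if u < 0.6:
        return -1
    elif u < 0.8:
        return 0
    return 1

def hitting_time(max_steps=10**6):
    """Number of steps until S_n first equals -1, starting from S_0 = 0."""
    s, n = 0, 0
    while s != -1 and n < max_steps:
        s += sample_step()
        n += 1
    return n

trials = 50_000
sim_mean = sum(hitting_time() for _ in range(trials)) / trials
e_xi = -1 * 0.6 + 0 * 0.2 + 1 * 0.2          # = -0.4
print(f"simulated m ~ {sim_mean:.3f}, -1/E(xi) = {-1 / e_xi:.3f}")
```

The simulated mean should land close to $-1/E(\xi)$; with positive or zero drift the simulated hitting times would instead grow without bound, matching the $m=\infty$ case.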

For more reading, take a look at Norris's "Markov Chains", Theorem $1.3.5$, page $17$; I would advise getting hold of a DjVu version for easier reading.