Simple random walk with biased probability


$Y_1,Y_2, Y_3, \cdots$ are independent and identically distributed R.V.s such that $$\Pr (Y_i = 1) = p = 1 - \Pr (Y_i = -1),$$ where $p\in(\frac 1 2,1)$. If $$S_n = Y_1+Y_2+Y_3 +\cdots+ Y_n,$$ prove that $$S_n \xrightarrow{\mathrm{a.s.}} \infty \quad \text{as } n \to \infty.$$

I see this as a random walk problem related to Markov chains. But I am thinking of using the law of large numbers: first show that $E(Y_i)>0$, so that by the law $S_n \approx nE(Y_i)$, and since $nE(Y_i) \to \infty$ as $n \to \infty$, $S_n$ should also approach $\infty$. If that is right, how would I make it more rigorous?

p.s. $E(Y_i)>0$ is easy to show: with $q = 1-p$, we get $E(Y_i) = p \cdot 1 - q \cdot 1 = p - q = 2p-1 > 0$ since $p > \frac12$.
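As a quick sanity check on that arithmetic, here is a minimal Monte Carlo sketch (the function name and the choice $p = 0.7$ are mine, not from the problem); the empirical mean of the steps should land near $2p - 1$:

```python
import random

def sample_mean(p, n, seed=0):
    """Empirical mean of n i.i.d. steps Y_i = +1 w.p. p, -1 w.p. 1 - p."""
    rng = random.Random(seed)
    total = sum(1 if rng.random() < p else -1 for _ in range(n))
    return total / n

# With p = 0.7 the theoretical mean is 2p - 1 = 0.4; a run of
# 100,000 steps should give an estimate close to that value.
print(sample_mean(0.7, 100_000))
```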

On BEST ANSWER

That is the right idea, and maps directly to a rigorous argument. As you say, we have $E(Y)=2p-1>0,$ and it's also easy to see that $E(|Y|) < \infty,$ so the strong law of large numbers applies and gives us $\frac{1}{n}\sum_{k=1}^n Y_k \to E(Y)>0$ almost surely.

And then, along the same lines you suggest, $\sum_{k=1}^n Y_k\to \infty$ almost surely. (And in case it's not clear, it's all analysis from here on out, no more probability.) If you're looking to clinch that in a little more detail, note for any $\epsilon > 0$ we can choose an $N$ so that $\left|\frac{1}{n}\sum_{k=1}^n Y_k - E(Y)\right| <\epsilon$ for all $n>N.$

Now let $M>0.$ We have $$ S_n> nE(Y)-n\epsilon$$ for any $n>N,$ so choose $\epsilon < E(Y)$ and then enlarge $N$ (if necessary) so that $N> \frac{M}{E(Y)-\epsilon}$ to make $n(E(Y)-\epsilon) >M$ for all $n>N.$
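The argument can be illustrated numerically. Below is a small simulation sketch (the helper name, parameters, and level $M = 1000$ are my own choices for illustration) that records the first time the biased walk exceeds a level $M$; with drift $E(Y) = 2p - 1$, one expects the crossing to occur around $n \approx M/(2p-1)$ steps:

```python
import random

def first_passage(p, M, seed=1, cap=10_000_000):
    """Number of steps until the walk S_n first exceeds M (None if cap is hit)."""
    rng = random.Random(seed)
    s = 0
    for n in range(1, cap + 1):
        s += 1 if rng.random() < p else -1  # one step Y_n of the walk
        if s > M:
            return n
    return None

# With p = 0.7 the drift is 2p - 1 = 0.4, so the walk should cross
# M = 1000 after roughly M / 0.4 = 2500 steps.
print(first_passage(0.7, 1000))
```

Of course a simulation proves nothing; the epsilon-$N$ argument above is what makes the claim rigorous. The run merely shows the drift $n(E(Y)-\epsilon)$ carrying $S_n$ past any fixed $M$.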