An application of Kolmogorov's 0-1 law to Bernoulli random variables

In Shiryaev's Probability 2, the following application of Kolmogorov's $0$-$1$ law is given,

Let $\xi_1, \xi_2, \ldots$ be a sequence of i.i.d. Bernoulli random variables with $\Pr(\xi_i=-1)=\Pr(\xi_i=1)=\frac{1}{2}$, and let $S_n = \sum_{i=1}^n \xi_i$. Then $$\Pr(S_n=0 \ \text{i.o.})=1.$$

The proof uses the following claim, which is not immediately obvious to me:

$$A = \left\{\limsup \frac{S_n}{\sqrt{n}}=\infty, \liminf \frac{S_n}{\sqrt{n}}=-\infty\right\} \subseteq B=\{S_n=0 \ i.o.\}$$

The author then shows that the event $A$ has probability $1$ by noting that $\{\limsup \frac{S_n}{\sqrt{n}} > N\} \cap \{\liminf \frac{S_n}{\sqrt{n}}<-N\} \downarrow A$ as $N\to \infty$, and that these are tail events with positive probability; by Kolmogorov's $0$-$1$ law each therefore has probability $1$, and so does $A$. (A related proof that the probability is positive can be found in this thread.)

Why is the inclusion of events $A\subseteq B$ true in this case? What would be the intuition behind this?

There is 1 answer below.

Note that $S_n$ changes by exactly $\pm 1$ as $n$ increases by $1$ (so $S_n$ has the same parity as $n$). Therefore, if $S_n$ moves from a negative value at $n=n_1$ to a positive value at $n=n_2$, it must attain $0$ at some $n$ between $n_1$ and $n_2$: the walk cannot jump over $0$. The same applies if $S_n$ moves from a positive value to a negative value.
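This crossing argument can be checked empirically with a short simulation (an illustrative sketch, not part of the original proof): since consecutive values of $S_n$ differ by exactly $1$, no two adjacent values of the walk can have opposite strict signs, so every sign change passes through $0$.

```python
import random

random.seed(0)

# Simulate the walk S_n = xi_1 + ... + xi_n with i.i.d. steps xi_i = +-1.
N = 100_000
S = [0]
for _ in range(N):
    S.append(S[-1] + random.choice([-1, 1]))

# Consecutive values differ by exactly 1 ...
assert all(abs(S[n + 1] - S[n]) == 1 for n in range(N))
# ... so the walk never jumps from a strictly negative value directly
# to a strictly positive one (or vice versa) without visiting 0.
assert not any(S[n] * S[n + 1] < 0 for n in range(N))
```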

Assuming both $\limsup S_n/\sqrt{n} = \infty$ and $\liminf S_n/\sqrt{n} = -\infty$ means that $S_n$ takes positive and negative values (of arbitrarily large absolute value) infinitely many times, so the walk crosses between negative and positive values infinitely often. By the observation above, each such crossing forces $S_n = 0$ at some intermediate time, so $S_n$ attains $0$ infinitely many times. Therefore $A \subseteq B$.
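As an empirical companion (again only an illustration; a finite simulation cannot establish the almost-sure statement), one can count how often a simulated walk returns to $0$ over a long horizon:

```python
import random

random.seed(1)

# Count visits of the simple +-1 random walk to 0 over N steps.
N = 1_000_000
s = 0
returns = 0
for _ in range(N):
    s += random.choice([-1, 1])
    if s == 0:
        returns += 1

# The expected number of returns grows without bound (on the order of
# sqrt(n)), consistent with Pr(S_n = 0 i.o.) = 1.
print(returns)
```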