Law of large numbers: modified statement


The weak law of large numbers states that, given $Y_n = \sum_{k=1}^{n} X_k$, where the $X_k$ are independent and identically distributed random variables with finite expectation $\mu$, $$ \forall \delta>0, \, \forall \epsilon>0 \, \, \exists N>0 \, \, \, s.t. \, \, \, \forall n > N, \, \, \, P ( |Y_n/n - \mu| > \delta ) < \epsilon. $$ From this statement, is there a simple way to prove that there is a nonzero probability that $$ \forall n>0 \, \, \, \, \, \, \, \, \, \, Y_n/n - \mu > 0 \, \, \, \,? $$ (Note: without the absolute value in the difference.)
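As a quick numerical illustration of the law being quoted (a sketch only, not part of the question; the uniform(0,1) increments and all function names are my own choices), one can estimate $P(|Y_n/n - \mu| > \delta)$ by Monte Carlo and watch it shrink as $n$ grows:

```python
import random

def wlln_tail_prob(n, delta=0.1, mu=0.5, trials=2000, seed=0):
    """Monte Carlo estimate of P(|Y_n/n - mu| > delta) for i.i.d.
    uniform(0,1) increments, whose common mean is mu = 0.5."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        y = sum(rng.random() for _ in range(n))  # Y_n = X_1 + ... + X_n
        if abs(y / n - mu) > delta:
            bad += 1
    return bad / trials

# The estimated tail probability drops as n grows, as the WLLN predicts.
print(wlln_tail_prob(10), wlln_tail_prob(1000))
```

For $n = 10$ the sample mean still misses $\mu$ by more than $\delta = 0.1$ fairly often; by $n = 1000$ the estimated probability is essentially zero.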

Best answer:

No, this is false.

A trivial counterexample to the statement as written (with strict inequality) is $X_k = 0$ for all $k$.

If we replace the strict inequality with a weak one ($\ge 0$), the claim is still false in all nontrivial cases:

If $X_k$ has a nonconstant distribution, then almost surely, $Y_n / n - \mu > 0$ for infinitely many $n$, and $Y_n / n - \mu < 0$ for infinitely many $n$.
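This oscillation is easy to observe numerically. A minimal sketch (illustration only; I use fair $\pm 1$ increments, which are nonconstant with mean zero, and the function name is my own):

```python
import random

def walk_extremes(steps, seed=0):
    """Simulate Y_n = X_1 + ... + X_n with fair +/-1 increments and
    return (min_n Y_n, max_n Y_n, number of strict sign changes)."""
    rng = random.Random(seed)
    y = 0
    lo = hi = 0
    last_sign = 0
    changes = 0
    for _ in range(steps):
        y += 1 if rng.random() < 0.5 else -1
        lo = min(lo, y)
        hi = max(hi, y)
        s = (y > 0) - (y < 0)  # sign of Y_n
        if s != 0:
            if last_sign != 0 and s != last_sign:
                changes += 1
            last_sign = s
    return lo, hi, changes

lo, hi, changes = walk_extremes(200_000)
print(lo, hi, changes)  # typically lo < 0 < hi, with many sign changes
```

A typical run visits both signs many times; of course a finite simulation can only suggest, not prove, the "infinitely often" statement.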

Without loss of generality, we may assume $\mu = 0$ (replace $X_k$ by $X_k - \mu$). We will show that almost surely, $\limsup Y_n = +\infty$ and $\liminf Y_n = -\infty$; in particular, $Y_n$ takes each sign infinitely often.

We use the Hewitt-Savage zero-one law, which says that any event invariant under finite permutations of the $X_k$ has probability 0 or 1. In particular, any event of the form $\{\limsup Y_n = a\}$ or $\{\limsup Y_n > a\}$ has this property: a permutation of $X_1, \dots, X_m$ leaves $Y_n$ unchanged for all $n \ge m$, because addition is commutative.
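The permutation-invariance fact used here can be checked directly: permuting $X_1, \dots, X_m$ changes the first few partial sums but leaves $Y_n$ untouched for every $n \ge m$. A small sketch (illustrative Gaussian values and names of my own choosing):

```python
import random

def partial_sums(xs):
    """Return the list [Y_1, Y_2, ..., Y_len(xs)] of partial sums."""
    out, y = [], 0.0
    for x in xs:
        y += x
        out.append(y)
    return out

rng = random.Random(42)
xs = [rng.gauss(0, 1) for _ in range(20)]

# Permute (here: reverse) the first m increments and recompute.
m = 5
xs_perm = list(reversed(xs[:m])) + xs[m:]

ys = partial_sums(xs)
ys_perm = partial_sums(xs_perm)

# Y_n agrees for every n >= m (0-based indices m-1 onward), which is why
# events like {limsup Y_n > a} are invariant under finite permutations.
print(all(abs(a - b) < 1e-9 for a, b in zip(ys[m - 1:], ys_perm[m - 1:])))  # True
```

The earlier partial sums $Y_1, \dots, Y_{m-1}$ generally do change, but the tail behavior of the sequence does not.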

First, note that the event $\{\limsup Y_n = -\infty\}$ (i.e., $Y_n \to -\infty$) has probability 0 or 1 by Hewitt-Savage. This probability is 0. For a symmetric distribution this follows by comparing with the event $\{\liminf Y_n = +\infty\}$, which then has the same probability; the two events are mutually exclusive, so the common probability cannot be 1 and must be 0. In general it follows from the Chung-Fuchs theorem, which says that a random walk with mean-zero increments is recurrent and in particular cannot drift to $-\infty$. Thus $\limsup Y_n > -\infty$ almost surely.

Therefore, there exists $a > -\infty$ such that $P(\limsup Y_n > a) > 0$; by Hewitt-Savage, $P(\limsup Y_n > a) = 1$. Let $Y_n' = X_2 + \dots + X_n$, so that $Y_n'$ has the same distribution as $Y_{n-1}$; in particular, $P(\limsup Y_n' > a) = P(\limsup Y_n > a) = 1$. Since $X_k$ has a nonconstant distribution with mean zero, there exists $\epsilon>0$ such that $P(X_1 > \epsilon) > 0$. So we have $$\begin{align*} P(\limsup Y_n > a + \epsilon) &\ge P(X_1 > \epsilon, \limsup Y_n' > a) \\ &= P(X_1 > \epsilon) P(\limsup Y_n' > a) > 0, \end{align*}$$ where we used the fact that $X_1$ and $\{Y_n'\}$ are independent. Using Hewitt-Savage a final time, we conclude $P(\limsup Y_n > a+\epsilon) = 1$. Iterating, $P(\limsup Y_n > b) = 1$ for all $b$, which is to say $\limsup Y_n = +\infty$ almost surely. Applying the same argument to the walk with increments $-X_k$ (which are also nonconstant with mean zero) gives $\liminf Y_n = -\infty$ almost surely.