Law of Large Numbers - utility/difficulty of various versions.


This may or may not be an answer to Is there an easy proof that the set of $x \in [0,1]$ whose limit of proportion of 1's in binary expansion of $x$ does not exist has measure zero?, depending on how easy a proof has to be to count as easy.

The law of large numbers comes in various versions:

LLN${}_p$ (one version for each $p\ge1$): Suppose $X_1,X_2,\dots$ are iid with $\Bbb EX_1=0$ and $\Bbb E|X_1|^p<\infty$. Then $\frac1n(X_1+\dots+X_n)\to0$ almost surely.

Proving LLN${}_1$ is considerably harder than proving LLN${}_2$. My impression is that this gives LLN${}_2$ the higher utility/difficulty ratio, since it suffices for many applications. It seems to me that LLN${}_4$ has a still higher utility/difficulty ratio: there's an awesomely easy proof of LLN${}_4$. The obligatory question, then, is this:

Question Are there many standard probability distributions, ones that actually come up in practice, that give $\Bbb E X^4=\infty$ but $\Bbb E X^2<\infty$?

Hoping for an answer to that question is why I'm posting this. The idea that people might be amused by the proof of LLN${}_4$ is absolutely no part of my motivation, I swear; that would not be a legal reason for this post's existence.

Of course you need to see the proof I have in mind in order to evaluate that utility/difficulty ratio:

Say $S_n = X_1+\dots+X_n$, and multiply out the fourth power: $$S_n^4=\sum_{j_1,j_2,j_3,j_4=1}^n X_{j_1}X_{j_2}X_{j_3}X_{j_4}.$$Independence, together with $\Bbb EX_j=0$, shows that most of those terms have mean $0$; the only terms with non-zero mean are of the form $X_j^4$ or a rearrangement of $X_j^2X_k^2$ with $j\ne k$. A trivial bit of combinatorics then shows that $$\Bbb ES_n^4\le cn^2.$$
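Spelling out that combinatorial step, in case it helps: there are $n$ terms of the form $X_j^4$, and each unordered pair $j<k$ contributes $\binom42=6$ rearrangements of $X_j^2X_k^2$, so $$\Bbb ES_n^4=n\,\Bbb EX_1^4+3n(n-1)\left(\Bbb EX_1^2\right)^2\le cn^2,$$with, say, $c=\Bbb EX_1^4+3\left(\Bbb EX_1^2\right)^2$.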

Since $\Bbb E\left(\frac1nS_n\right)^4\le c/n^2$, monotone convergence shows that $$\Bbb E\left[\sum_n\left(\frac 1n S_n\right)^4\right]=\sum_n\frac{\Bbb ES_n^4}{n^4}\le c\sum_n\frac1{n^2}<\infty.$$Hence $\sum_n\left(\frac 1n S_n\right)^4<\infty$ almost surely; in particular the terms tend to $0$, so $\frac1nS_n\to0$ almost surely.

(Note I'm not claiming to be smart here; I saw this somewhere many years ago.)

There is 1 answer below.


The keyword is "fat-tailed" distributions. Anything whose density falls off like $1/x^p$ as $x\to\infty$ has finite moments of exactly the orders $k<p-1$ (so, for integer $p$, up to and including $p-2$). These come up more often than people expect, especially in finance and the actuarial sciences, and they are absolutely catastrophic when it comes to underestimating rare events (read something like The Black Swan by Taleb), especially in a case like the Cauchy distribution, with density $\frac{1}{\pi}\frac{1}{x^2+1}$, which has no mean in the first place. A density with a $1/x^{2+\epsilon}$ tail for small $\epsilon$ will also be a headache numerically. As for the question asked: a Student $t$ distribution with $3$ or $4$ degrees of freedom, or a Pareto distribution with tail index $\alpha\in(2,4]$, has $\Bbb EX^2<\infty$ but $\Bbb EX^4=\infty$.
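To see the moment threshold numerically, here is a sketch using NumPy; the tail index $\alpha=3$, the seed, and the sample sizes are illustrative choices, not anything from the post. For a Pareto distribution on $[1,\infty)$ with $\alpha=3$ we have $\Bbb EX^2=\alpha/(\alpha-2)=3$ but $\Bbb EX^4=\infty$, so the empirical second moment should settle down while the empirical fourth moment keeps jumping around, dominated by the largest draw so far.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 3.0
# Generator.pareto draws from the Lomax distribution on [0, inf);
# adding 1 shifts it to the classical Pareto-I distribution on [1, inf).
xs = rng.pareto(alpha, size=100_000) + 1.0

for n in (10**3, 10**4, 10**5):
    m2 = np.mean(xs[:n] ** 2)  # finite second moment: should hover near 3
    m4 = np.mean(xs[:n] ** 4)  # infinite fourth moment: dominated by extremes
    print(f"n={n:>6}  m2={m2:10.3f}  m4={m4:14.1f}")
```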

For your benefit, I suggest writing a simple program that samples from the Cauchy distribution to see how the LLN fails. Try a similar exercise with the extra $\epsilon$ as well.
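A minimal version of that experiment (a sketch assuming NumPy; the seed and sample sizes are arbitrary choices): running means of Cauchy samples never settle down, because the Cauchy distribution has no mean at all, while the running means of a standard normal control sample converge as the LLN promises.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
cauchy = rng.standard_cauchy(n)
normal = rng.standard_normal(n)  # control case where the LLN does hold

steps = np.arange(1, n + 1)
cauchy_means = np.cumsum(cauchy) / steps  # keeps lurching after big draws
normal_means = np.cumsum(normal) / steps  # settles toward 0

for n_check in (10**3, 10**4, 10**5):
    print(f"n={n_check:>6}  Cauchy mean={cauchy_means[n_check - 1]:9.3f}  "
          f"normal mean={normal_means[n_check - 1]:9.3f}")
```

The telltale signature in the Cauchy column is that a single extreme sample can drag the running mean far away no matter how large $n$ already is.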