Questions on the proof of the strong law of large numbers


Here is the proof of the strong law of large numbers presented in the textbook I'm currently using.

Let $X_1,X_2,...$ be a sequence of independent identically distributed random variables and assume that $E[(X_i)^{4}] < \infty$. Prove the strong law of large numbers.

We note that the assumption $E[(X_i)^{4}] < \infty$ implies that the expected value of the $X_i$ is finite. Indeed, using the inequality $|x| \leq 1 + x^{4}$, we have $E[|X_i|] \leq 1 + E[(X_i)^{4}] < \infty$. Let us assume first that $E[X_i]=0$. We will show that $$E\left[\sum_{n=1}^{\infty} \frac{(X_1+X_2+\cdots+X_n)^{4}}{n^{4}}\right] < \infty.$$ We have $$E\left[\frac{(X_1+X_2+\cdots+X_n)^{4}}{n^{4}}\right] = \frac{1}{n^{4}} \sum_{i_1=1}^{n} \sum_{i_2=1}^{n} \sum_{i_3=1}^{n} \sum_{i_4=1}^{n} E[X_{i_1}X_{i_2}X_{i_3}X_{i_4}]$$

Let us consider the various terms in this sum. If one of the indices is different from all of the other indices, the corresponding term is equal to zero. For example, if $i_1$ is different from $i_2,i_3,i_4$, the assumption $E[X_i]=0$ yields $$E[X_{i_1}X_{i_2}X_{i_3}X_{i_4}] = E[X_{i_1}]E[X_{i_2}X_{i_3}X_{i_4}]=0.$$ Therefore, the nonzero terms in the above sum are either of the form $E[(X_i)^{4}]$ or of the form $E[(X_i)^{2}(X_j)^{2}]$ with $i \neq j$. A counting argument shows that there are $n$ terms of the first type and $3n(n-1)$ terms of the second type. Thus $$E[(X_1+X_2+\cdots+X_n)^{4}]=nE[(X_1)^{4}]+3n(n-1)E[(X_1)^{2}(X_2)^{2}].$$ Using the inequality $xy \leq (x^{2}+y^{2})/2$, we obtain $E[(X_1)^{2}(X_2)^{2}] \leq E[(X_1)^{4}]$ and hence $E[(X_1+X_2+\cdots+X_n)^{4}] \leq 3n^{2}E[(X_1)^{4}]$.
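As a sanity check on this counting (my own illustration, not from the textbook), a short brute-force enumeration confirms the $n$ and $3n(n-1)$ counts for small $n$: a quadruple of indices contributes $E[(X_i)^4]$ when all four indices coincide, contributes $E[(X_i)^2(X_j)^2]$ when it consists of two distinct indices each appearing twice, and contributes zero otherwise.

```python
from itertools import product

def count_term_types(n):
    """Classify the n^4 index quadruples (i1, i2, i3, i4) in the
    expansion of E[(X_1 + ... + X_n)^4] by their repeat pattern."""
    all_same = pair_pair = 0
    for q in product(range(n), repeat=4):
        counts = sorted(q.count(i) for i in set(q))
        if counts == [4]:
            all_same += 1      # gives an E[(X_i)^4] term
        elif counts == [2, 2]:
            pair_pair += 1     # gives an E[(X_i)^2 (X_j)^2] term, i != j
        # every other pattern has some index appearing once -> term is 0
    return all_same, pair_pair

for n in (2, 3, 5):
    s, p = count_term_types(n)
    assert s == n and p == 3 * n * (n - 1)
```

The $3n(n-1)$ count also follows directly: there are $\binom{n}{2}$ unordered pairs $\{i,j\}$ and $\binom{4}{2}=6$ ways to place the pair pattern among four positions, giving $6\binom{n}{2}=3n(n-1)$.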

It follows that $$E\left[\sum_{n=1}^{\infty} \frac{(X_1+X_2+\cdots+X_n)^{4}}{n^{4}}\right] = \sum_{n=1}^{\infty} \frac{1}{n^{4}} E[(X_1+X_2+\cdots+X_n)^{4}] \leq \sum_{n=1}^{\infty}\frac{3}{n^{2}}E[(X_1)^{4}] < \infty$$

This implies that $(X_1+\cdots+X_n)^{4}/n^{4}$ converges to zero with probability 1, and therefore $(X_1+\cdots+X_n)/n$ also converges to zero with probability 1, which is the strong law of large numbers.
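To see the key moment bound in action (my own illustration, not part of the proof), one can estimate $E[(X_1+\cdots+X_n)^4]$ by Monte Carlo and compare it to $3n^{2}E[(X_1)^{4}]$. Taking standard normal $X_i$ (zero mean, $E[(X_i)^4]=3$), the bound becomes $9n^2$; the exact value for normals is $3n^2$, comfortably inside it.

```python
import numpy as np

def fourth_moment_of_sum(n, trials=20_000, seed=0):
    """Monte Carlo estimate of E[(X_1 + ... + X_n)^4] for i.i.d.
    standard normal X_i (zero mean, E[(X_i)^4] = 3)."""
    rng = np.random.default_rng(seed)
    s = rng.standard_normal((trials, n)).sum(axis=1)  # samples of S_n
    return (s**4).mean()

# The proof's bound: E[S_n^4] <= 3 n^2 E[X^4] = 9 n^2 here.
for n in (10, 100, 500):
    assert fourth_moment_of_sum(n) <= 9 * n**2
```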

Here are my questions about this proof:

1. Where does the assumption $E[(X_i)^{4}] < \infty$ come from? It just feels so out of the blue.

2. For the part where they used the inequality $xy \leq (x^{2}+y^{2})/2$ to obtain $E[(X_1)^{2}(X_2)^{2}] \leq E[(X_1)^{4}]$: is it that they first squared both sides of the inequality and then, because the random variables are identically distributed, the right side could be written as $\frac{(2x^{2})^{2}}{4}=x^{4}$?

3. Could someone explain how the monotone convergence theorem allows the equality $E\left[\sum_{n=1}^{\infty} \frac{(X_1+X_2+\cdots+X_n)^{4}}{n^{4}}\right] = \sum_{n=1}^{\infty} \frac{1}{n^{4}} E[(X_1+X_2+\cdots+X_n)^{4}]$? I ask because the book says the theorem is beyond its scope, so there isn't much about it there.

4. How is the final implication established? What allows one to conclude that if $(X_1+\cdots+X_n)^{4}/n^{4}$ converges to zero with probability 1, then $(X_1+\cdots+X_n)/n$ also converges to zero with probability 1?

Thank you and I really appreciate any help given ! :)

On BEST ANSWER

The whole argument here depends on $E(X^4)<\infty$. The conclusion is true assuming only $E(|X|)<\infty$, but that is much harder to prove.

Applying $xy\le\frac12(x^2+y^2)$ to $x=X_1^2$ and $y=X_2^2$ and taking expectation gives $$E(X_1^2X_2^2)\le\frac{E(X_1^4)+E(X_2^4)}2.$$ As the variables are identically distributed, $E(X_2^4)=E(X_1^4)$.
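A quick numeric check of this inequality (my own illustration; the choice of Uniform$(-1,1)$ is arbitrary): for that distribution $E[X_1^2X_2^2]=(1/3)^2=1/9$ while $E[X_1^4]=1/5$, and a simulation agrees.

```python
import numpy as np

# Check E[X1^2 X2^2] <= E[X1^4] for i.i.d. Uniform(-1, 1):
# exactly, E[X1^2 X2^2] = 1/9 and E[X1^4] = 1/5.
rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, 1_000_000)
x2 = rng.uniform(-1, 1, 1_000_000)
lhs = (x1**2 * x2**2).mean()
rhs = (x1**4).mean()
assert lhs <= rhs
```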

The monotone convergence theorem (MCT) implies that for random variables $Y_1,Y_2,\ldots$ with each $Y_n\ge0$, $$E\left(\sum_{n=1}^\infty Y_n\right)=\sum_{n=1}^\infty E(Y_n).$$ It's like integrating a sum of non-negative functions term by term.
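A small finite-truncation illustration (my own sketch; for any finite sum the swap is just linearity of expectation, and MCT is what extends it to the infinite sum when the terms are nonnegative). Here I take $Y_n = Z^2/2^n$ with $Z$ standard normal, so both orders of summation should approach $E[Z^2]\sum_n 2^{-n} = 1$:

```python
import numpy as np

rng = np.random.default_rng(2)
z2 = rng.standard_normal(100_000) ** 2     # samples of Z^2, E[Z^2] = 1
weights = 0.5 ** np.arange(1, 31)          # 1/2^n for n = 1..30
ys = z2[:, None] * weights                 # rows: one path of (Y_1, ..., Y_30)
e_of_sum = ys.sum(axis=1).mean()           # estimates E[sum_n Y_n]
sum_of_e = ys.mean(axis=0).sum()           # estimates sum_n E[Y_n]
assert abs(e_of_sum - sum_of_e) < 1e-8     # identical up to float rounding
```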

From the sum of the expectations being finite, one gets via a Borel–Cantelli argument (or simply because a convergent series of nonnegative terms must have terms tending to zero) that $n^{-4}(X_1+\cdots+X_n)^4\to0$ almost surely, and that is exactly the same as $n^{-1}(X_1+\cdots+X_n)\to0$: for a real sequence, $a_n^4\to0$ if and only if $a_n\to0$.
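As a final illustrative sketch (mine, not the answerer's), one can watch a single sample path of the running mean $S_n/n$ settle toward zero for zero-mean variables with a finite fourth moment, e.g. Uniform$(-1,1)$:

```python
import numpy as np

def tail_of_running_mean(n_max=100_000, seed=3):
    """Largest |S_n / n| over the last tenth of one sample path of
    i.i.d. zero-mean Uniform(-1, 1) variables (finite fourth moment)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, n_max)
    means = np.cumsum(x) / np.arange(1, n_max + 1)  # S_n / n for each n
    return float(np.abs(means[-n_max // 10:]).max())

# On a typical path the running mean has settled near 0 by this point.
assert tail_of_running_mean() < 0.05
```

This is only a plausibility check for one path, of course; the proof above is what guarantees convergence with probability 1.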