$\{X_n, n \ge 1\}$ i.i.d., $S_n = X_1 + \cdots + X_n$, $E(X_1) = 0$. Prove that $E|S_n|/n \to 0$


Suppose $\{X_n, n \ge 1\}$ are i.i.d. with $X_1 \in L^1$, let $S_n = X_1 + \cdots + X_n$, and assume $E(X_1) = 0$. Prove that $E|S_n|/n \to 0$.

Attempt: $\{X_n^+\}$ and $\{X_n^-\}$ are each i.i.d. We have $E|S_n| = E\left|\sum_i X_i^+ - \sum_j X_j^-\right|$ and, since $E(X_1) = 0$, $E\left(\sum_i X_i^+\right) = E\left(\sum_j X_j^-\right)$. By the Strong Law of Large Numbers, $\frac{1}{n}\sum_{i=1}^n X_i^+ \to E(X_1^+)$ a.s. and $\frac{1}{n}\sum_{i=1}^n X_i^- \to E(X_1^-)$ a.s., but there is still a gap to the final result.



You can use the following fact to prove it:

Proposition

Let $\{X_n\}_{n\in\mathbb{N}}$, $\{Y_n\}_{n\in\mathbb{N}}$ be sequences of $\mathbb{R}$-valued random variables such that $\lvert X_n \rvert \leq Y_n$ a.s. for all $n \in \mathbb{N}$. Let $X$, $Y$ be $\mathbb{R}$-valued random variables such that $\lim_{n \to \infty} X_n = X$ and $\lim_{n \to \infty} Y_n = Y$ a.s. Suppose that $\lim_{n \to \infty} \mathbb{E}[Y_n] = \mathbb{E}[Y] < \infty$. Then, $$ \lim_{n \to \infty} \mathbb{E}[X_n] = \mathbb{E}[\lim_{n \to \infty} X_n] = \mathbb{E}[X]. $$
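A proof sketch (this is essentially Pratt's lemma, argued via Fatou's lemma just as in the usual proof of dominated convergence): since $Y_n + X_n \ge 0$ and $Y_n - X_n \ge 0$ a.s., Fatou's lemma gives $$ \mathbb{E}[Y + X] \le \liminf_{n \to \infty} \mathbb{E}[Y_n + X_n] = \mathbb{E}[Y] + \liminf_{n \to \infty} \mathbb{E}[X_n], $$ $$ \mathbb{E}[Y - X] \le \liminf_{n \to \infty} \mathbb{E}[Y_n - X_n] = \mathbb{E}[Y] - \limsup_{n \to \infty} \mathbb{E}[X_n]. $$ Cancelling the finite term $\mathbb{E}[Y]$ yields $\limsup_{n} \mathbb{E}[X_n] \le \mathbb{E}[X] \le \liminf_{n} \mathbb{E}[X_n]$, hence $\mathbb{E}[X_n] \to \mathbb{E}[X]$.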

This is a version of Proposition A.58 in Theory of Statistics by Mark J. Schervish, rewritten in the language of probability theory. The proposition can be proved in the same way as the Dominated Convergence Theorem.

Proof of your question

Let $Y_n = \frac{1}{n}\sum_{i=1}^{n} \lvert X_i \rvert$. By the triangle inequality, we have $\lvert S_n \rvert / n \leq \frac{1}{n}\sum_{i=1}^{n} \lvert X_i \rvert = Y_n$. And by the Strong Law of Large Numbers and the continuity of the absolute value function, we have

$$ \frac{\lvert S_n \rvert}{n} = \left\lvert \frac{S_n}{n} \right\rvert \stackrel{a.s.}{\to} \lvert \mathbb{E}[X_1] \rvert = 0, \qquad Y_n = \frac{1}{n} \sum_{i=1}^{n} \lvert X_i \rvert \stackrel{a.s.}{\to} \mathbb{E}[\lvert X_1 \rvert]. $$

Furthermore, since the $X_i$ are identically distributed, linearity of expectation gives $\mathbb{E}[Y_n] = \mathbb{E}[\lvert X_1 \rvert] \to \mathbb{E}[\lvert X_1 \rvert] < \infty$ as $n \to \infty$.

Therefore, by the Proposition, we have $$ \lim_{n \to \infty} \mathbb{E}\left[ \frac{\lvert S_n \rvert}{n} \right] = \mathbb{E}\left[ \lim_{n \to \infty} \frac{\lvert S_n \rvert}{n} \right] = 0. $$
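As a sanity check (not part of the proof), a small Monte Carlo simulation can illustrate the convergence. The choice of distribution here, $X_1 = \mathrm{Exp}(1) - 1$ (mean zero, $E|X_1| < \infty$), is an arbitrary assumption for illustration:

```python
import numpy as np

def mean_abs_avg(n, n_trials=500, seed=0):
    """Monte Carlo estimate of E|S_n|/n for X_i = Exp(1) - 1 (mean zero)."""
    rng = np.random.default_rng(seed)
    # n_trials independent copies of (X_1, ..., X_n)
    X = rng.exponential(1.0, size=(n_trials, n)) - 1.0
    return np.abs(X.sum(axis=1)).mean() / n

for n in (10, 100, 1000, 10000):
    print(n, mean_abs_avg(n))  # the estimate shrinks as n grows
```

With finite variance the decay is of order $n^{-1/2}$, so the printed estimates fall roughly by a factor of $\sqrt{10}$ per line.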


Here is a reply to @Snoop's comment. The claim can be regarded as the $L^1$ version of the LLN. The following proof does not use the LLN. We assume $X_1 \in L^1$.

We use a truncation argument. For $M > 0$ and $i \ge 1$, we let $X_i^{(M)} := ((-M) \vee X_i) \wedge M$.

Step 1. For fixed $M > 0$, consider $X_i^{(M)}, i \ge 1$. They are i.i.d., so we have that $$E\left[\left(\sum_{i=1}^{n} (X_i^{(M)} - E[X_i^{(M)}]) \right)^2\right] = n \text{Var}(X_1^{(M)}) \le n M^2.$$

Hence $$ \lim_{n \to \infty} E\left[ \left(\frac{1}{n}\sum_{i=1}^{n} X_i^{(M)} - E\left[X_1^{(M)}\right]\right)^2\right] = 0.$$ By Jensen's inequality, $E|Z| \le \left(E[Z^2]\right)^{1/2}$, so $$ \lim_{n \to \infty} E\left[ \, \left|\frac{1}{n}\sum_{i=1}^{n} X_i^{(M)} - E\left[X_1^{(M)}\right]\right| \, \right] = 0.$$

Step 2. We deal with $X_i - X_i^{(M)}, i \ge 1$. By the triangle inequality and the identical distribution of the $X_i$, $$E\left[ \, \left|\frac{1}{n}\sum_{i=1}^{n} (X_i - X_i^{(M)})\right| \, \right] \le \frac{1}{n}\sum_{i=1}^{n} E\left[\left|X_i - X_i^{(M)}\right|\right] = E\left[\left|X_1 - X_1^{(M)}\right|\right].$$

Therefore, combining Steps 1 and 2 via the triangle inequality, for each $M > 0$, $$ \limsup_{n \to \infty} E\left[ \left| \frac{S_n}{n} - E\left[X_1^{(M)}\right] \right| \right] \le E\left[\left|X_1 - X_1^{(M)}\right|\right].$$

Furthermore, recalling that $E[X_1] = 0$, we have $\left|E\left[X_1^{(M)}\right]\right| = \left|E\left[X_1^{(M)} - X_1\right]\right| \le E\left[\left|X_1 - X_1^{(M)}\right|\right]$, so $$ \limsup_{n \to \infty} E\left[ \left| \frac{S_n}{n} \right| \right] \le 2E\left[\left|X_1 - X_1^{(M)}\right|\right].$$

We have $\left|X_1 - X_1^{(M)}\right| \le |X_1|$ and $X_1^{(M)} \to X_1$ as $M \to \infty$, and recall that $X_1 \in L^1$. Applying the dominated convergence theorem, we obtain $$ E\left[\left|X_1 - X_1^{(M)}\right|\right] \to 0, \quad M \to \infty. $$

Thus we obtain the assertion.
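To see the truncation bound at work numerically, the quantity $E\left|X_1 - X_1^{(M)}\right|$ from Step 2 can be estimated by Monte Carlo. The distribution $X_1 = \mathrm{Exp}(1) - 1$ is again an arbitrary mean-zero choice used only for illustration:

```python
import numpy as np

def trunc_err(M, n_samples=200_000, seed=1):
    """Monte Carlo estimate of E|X_1 - X_1^{(M)}| for X_1 = Exp(1) - 1
    (a hypothetical mean-zero example distribution)."""
    rng = np.random.default_rng(seed)
    X = rng.exponential(1.0, size=n_samples) - 1.0
    X_M = np.clip(X, -M, M)  # X_1^{(M)} = ((-M) v X_1) ^ M
    return np.abs(X - X_M).mean()

for M in (1, 2, 4, 8):
    print(M, trunc_err(M))  # shrinks rapidly as M grows
```

Since the bound on $\limsup_n E|S_n/n|$ is $2E\left|X_1 - X_1^{(M)}\right|$ for every $M$, the fast decay in $M$ visible here is exactly what makes the argument close.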