Prove the Lindeberg condition is satisfied.


Let $X_1, \ldots, X_n$ be iid with $E(X_i)=\mu$ and finite variance $\operatorname{var}(X_i)=\sigma^2$. Show that

$$\frac{1}{n\sigma^2}\sum_{i=1}^n E\left[(X_i-\mu)^2\, I\left(|X_i-\mu| \geq \epsilon\sigma \sqrt{n}\right)\right]$$ goes to zero as $n \to \infty$.

Can anyone help me figure out a starting point?

I can see that as $n \to \infty$, $\epsilon\sigma\sqrt{n} \to \infty$, so $$E\left[(X_i-\mu)^2\, I\left(|X_i-\mu| \geq \epsilon\sigma\sqrt{n}\right)\right] = 0.$$

But my question is, is that valid for all $n$?

Thank you


Best answer:

Since the random variables are identically distributed, the value

$$\mathbb{E}\left((X_i-\mu)^2 \, 1_{\{|X_i-\mu| \geq \epsilon \sigma \sqrt{n}\}}\right)$$

does not depend on $i$. Hence,

$$\frac{1}{n \sigma^2} \sum_{i=1}^n \mathbb{E}\left((X_i-\mu)^2 \, 1_{\{|X_i-\mu| \geq \epsilon \sigma \sqrt{n}\}}\right) = \frac{1}{\sigma^2} \mathbb{E}\left((X_1-\mu)^2 \, 1_{\{|X_1-\mu| \geq \epsilon \sigma \sqrt{n}\}}\right).$$

Now it follows from the fact that $X_1 \in L^2$ and the dominated convergence theorem that the right-hand side converges to $0$ as $n \to \infty$.
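A quick Monte Carlo sanity check (not part of the proof) illustrates this convergence. The choice of an $\mathrm{Exp}(1)$ distribution, the sample size, and $\epsilon = 0.1$ below are all arbitrary illustrative assumptions:

```python
import numpy as np

# Estimate E[(X_1 - mu)^2 * 1{|X_1 - mu| >= eps * sigma * sqrt(n)}] / sigma^2
# by Monte Carlo and watch it shrink as n grows. Exp(1) has mean 1
# and variance 1, so mu = sigma = 1 here.
rng = np.random.default_rng(0)
mu, sigma, eps = 1.0, 1.0, 0.1
x = rng.exponential(scale=1.0, size=10**6)

results = []
for n in [1, 100, 10_000]:
    cutoff = eps * sigma * np.sqrt(n)
    # Sample average of the truncated second moment, normalised by sigma^2.
    term = np.mean((x - mu) ** 2 * (np.abs(x - mu) >= cutoff)) / sigma**2
    results.append(term)
    print(f"n = {n:>6}: {term:.4f}")
```

The printed values decrease toward $0$, consistent with the dominated convergence argument.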

Second answer:

If $Y$ is a random variable with mean zero and variance one, it is possible that $\mathbb E\left[Y^2\mathbf 1\left\{\left\lvert Y\right\rvert \gt R\right\}\right]\ne 0$ for every real number $R$. What is true is that $\lim_{R\to +\infty}\mathbb E\left[Y^2\mathbf 1\left\{\left\lvert Y\right\rvert \gt R\right\}\right]=0$. This can be used in the present context. First notice that, denoting $Y_i:=\left(X_i-\mu\right)/\sigma$, we have $\mathbb E\left[Y_i^2\mathbf 1\left\{\left\lvert Y_i\right\rvert \gt n^{1/2}\varepsilon\right\}\right]=\mathbb E\left[Y_1^2\mathbf 1\left\{\left\lvert Y_1\right\rvert \gt n^{1/2}\varepsilon\right\}\right]$; then sum over $i$, divide by $n$...
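Spelling out the steps this answer sketches, with the same standardisation $Y_i=(X_i-\mu)/\sigma$:

$$\frac{1}{n\sigma^2}\sum_{i=1}^n \mathbb E\left[(X_i-\mu)^2\mathbf 1\left\{|X_i-\mu| \geq \epsilon\sigma\sqrt{n}\right\}\right] = \frac{1}{n}\sum_{i=1}^n \mathbb E\left[Y_i^2\mathbf 1\left\{|Y_i| \geq n^{1/2}\varepsilon\right\}\right] = \mathbb E\left[Y_1^2\mathbf 1\left\{|Y_1| \geq n^{1/2}\varepsilon\right\}\right],$$

and since $n^{1/2}\varepsilon \to \infty$ as $n \to \infty$, the limit fact above shows the right-hand side tends to $0$.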