Let $X_1, \dots, X_n$ be iid with $E(X_i) = \mu$ and finite variance $\operatorname{var}(X_i) = \sigma^2$. Show that
$$\frac{1}{n\sigma^2}\sum_{i=1}^n E\left[(X_i-\mu)^2\, I\left(|X_i-\mu| \geq \epsilon\sigma\sqrt{n}\right)\right]$$ goes to zero as $n \to \infty$.
Can anyone help me figure out a starting point?
I can see that when n goes to $\infty$ , $\sigma (n)^{0.5}$ goes to $\infty$. so $$E(X_i-\mu)^2I(|X_i-\mu| \geq \epsilon\sigma (n)^{0.5} = 0.$$
But my question is, is that valid for all $n$?
Thank you
Since the random variables are identically distributed, the value
$$\mathbb{E}\left[(X_i-\mu)^2 \, 1_{\{|X_i-\mu| \geq \epsilon \sigma \sqrt{n}\}}\right]$$
does not depend on $i$. Hence,
$$\frac{1}{n \sigma^2} \sum_{i=1}^n \mathbb{E}\left[(X_i-\mu)^2 \, 1_{\{|X_i-\mu| \geq \epsilon \sigma \sqrt{n}\}}\right] = \frac{1}{\sigma^2} \, \mathbb{E}\left[(X_1-\mu)^2 \, 1_{\{|X_1-\mu| \geq \epsilon \sigma \sqrt{n}\}}\right].$$
Now, since $X_1 \in L^2$, the integrand is dominated by $(X_1-\mu)^2$, which is integrable, and the indicator $1_{\{|X_1-\mu| \geq \epsilon\sigma\sqrt{n}\}}$ converges to $0$ pointwise as $n \to \infty$ (because $|X_1-\mu|$ is finite almost surely). By the dominated convergence theorem, the right-hand side converges to $0$ as $n \to \infty$. Note that the expectation need not equal $0$ for any finite $n$; the claim is only that it tends to $0$.
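This is not part of the proof, but one can check the convergence numerically. The sketch below uses Monte Carlo to estimate $\frac{1}{\sigma^2}\mathbb{E}\left[(X_1-\mu)^2 \, 1_{\{|X_1-\mu| \geq \epsilon\sigma\sqrt{n}\}}\right]$ for an exponential distribution; the choice of distribution, $\epsilon$, and sample size are all arbitrary assumptions made for illustration.

```python
import numpy as np

# Monte Carlo estimate of E[(X - mu)^2 * 1{|X - mu| >= eps*sigma*sqrt(n)}] / sigma^2
# for X ~ Exponential(1), so mu = sigma = 1.  Illustrative choices only.
rng = np.random.default_rng(0)
mu, sigma, eps = 1.0, 1.0, 0.5
samples = rng.exponential(scale=1.0, size=1_000_000)

def lindeberg_term(n: int) -> float:
    """Estimate the (identical) summand of the Lindeberg sum for a given n."""
    dev2 = (samples - mu) ** 2
    tail = np.abs(samples - mu) >= eps * sigma * np.sqrt(n)
    return float((dev2 * tail).mean()) / sigma**2

for n in (1, 10, 100, 10_000):
    print(n, lindeberg_term(n))
```

Because the truncation events are nested, the estimates decrease monotonically in $n$, and for large $n$ the tail event is empty in the sample, giving an estimate of $0$.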