Does variance of sample mean converge to zero?

Let $X_1,\ldots,X_n$ be an i.i.d. sample, let $\bar X_n$ be the sample mean, and let $\mu$ be the expectation of the underlying distribution. A finite variance is not assumed.

Does this always hold?

$$E[(\bar X_n-\mu)^2]\rightarrow0 \qquad (n\rightarrow \infty)$$

If yes, in which sense does this hold (i.e. almost surely / in probability / in distribution)?

If not, under what conditions does it hold? What if we additionally assume that the variance $\sigma^2$ is finite?

1 Answer

If $\operatorname{var}(X_1)<\infty$ then $\operatorname E\left(\left(\overline X_n - \mu\right)^2\right) = \dfrac{\operatorname{var}\left(X_1\right)} n \to 0$ as $n\to\infty.$
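For completeness, the computation behind that identity (not spelled out above) uses independence, the common variance, and $\operatorname E\left(\overline X_n\right)=\mu$:
$$\operatorname E\left(\left(\overline X_n-\mu\right)^2\right)=\operatorname{var}\left(\overline X_n\right)=\frac1{n^2}\sum_{i=1}^n\operatorname{var}(X_i)=\frac{\operatorname{var}(X_1)}n.$$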

However, if $\displaystyle \Pr(X_1\in A) = \int_A \frac{du}{\pi(1+u^2)}$ for every measurable set $A,$ i.e. if $X_1$ has a standard Cauchy distribution, then the distribution of $\overline X_n = (X_1+\cdots+X_n)/n$ is that same Cauchy distribution for every $n$. Its interquartile range is still from $-1$ to $+1$ no matter how big $n$ is, so $\overline X_n$ does not concentrate around any value as $n$ grows; indeed, in this case $\mu$ does not even exist.
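Not part of the original answer, but a small Monte Carlo sketch (assuming Python with NumPy) makes both behaviors visible: the interquartile range of the sample mean of standard normals shrinks like $1/\sqrt n$, while that of the Cauchy sample mean stays near $2$ (quartiles at $\pm1$) for every $n$.

```python
# Monte Carlo sketch: spread of the sample mean as n grows,
# finite-variance case (standard normal) vs. standard Cauchy.
import numpy as np

rng = np.random.default_rng(0)
n_reps = 2000  # simulated sample means per sample size

for n in (10, 100, 1000, 10000):
    # One row per replication, n draws per row; average across each row.
    normal_means = rng.standard_normal((n_reps, n)).mean(axis=1)
    cauchy_means = rng.standard_cauchy((n_reps, n)).mean(axis=1)

    # Interquartile range of the simulated sample means.
    normal_iqr = np.subtract(*np.percentile(normal_means, [75, 25]))
    cauchy_iqr = np.subtract(*np.percentile(cauchy_means, [75, 25]))
    print(f"n={n:>5}  normal IQR ~ {normal_iqr:.4f}   cauchy IQR ~ {cauchy_iqr:.4f}")
```

The normal column should drop roughly by a factor of $\sqrt{10}$ at each step, while the Cauchy column hovers around $2$ regardless of $n$.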