Why is the standard deviation of the sample mean $\frac{\sigma}{\sqrt{n}}$?


According to Wikipedia, the standard deviation of a sample mean is calculated as follows $$\frac{\sigma}{\sqrt{n}}$$

Why is that? Why do we divide the standard deviation of the population by the square root of $n$ (which, I take it, is the size of the sample)? Why does that make sense?

BEST ANSWER

The sample mean is defined by $$ \overline X=\frac1n\sum_{k=1}^nX_k. $$ If $X_1,\ldots,X_n$ are independent and identically distributed random variables, then the variance of a sum of independent random variables is the sum of the variances, so $$ \operatorname{Var}\overline X=\frac1{n^2}\operatorname{Var}\sum_{k=1}^nX_k=\frac1{n^2}\cdot n\operatorname{Var}X_1=\frac{\sigma^2}n, $$ where $\sigma^2=\operatorname{Var}X_1$. The standard deviation is then given by $$ \sqrt{\operatorname{Var}\overline X}=\frac{\sigma}{\sqrt n}. $$
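You can check this formula by simulation: a minimal sketch, assuming an i.i.d. normal population with $\sigma = 2$ and a sample size of $n = 25$ (both chosen here just for illustration), where the empirical standard deviation of many sample means should land near $\sigma/\sqrt{n} = 0.4$.

```python
import random
import statistics

random.seed(0)

n = 25          # sample size (assumed for illustration)
trials = 20_000 # number of repeated samples
sigma = 2.0     # population standard deviation

# Draw many samples of size n and record each sample's mean.
means = []
for _ in range(trials):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

empirical = statistics.pstdev(means)   # SD of the simulated sample means
theoretical = sigma / n ** 0.5         # sigma / sqrt(n)

print(f"empirical:   {empirical:.4f}")
print(f"theoretical: {theoretical:.4f}")
```

The two printed values agree to within simulation noise, which shrinks as `trials` grows.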