Why is sampling distribution of means's standard deviation sd/sqrt(n)?


I'm so used to seeing a standard deviation defined as the square root of an entire formula, so it's strange to me to see an SD defined as $\text{sd}/\sqrt{n}$. Can someone explain why?

It is the standard error of the mean, which is $\sigma/\sqrt{n}$.

If a sample of $n$ i.i.d. random variables each have mean $\mu$ and variance $\sigma^2$

  • then their sum has mean $n\mu$ and variance $n\sigma^2$ since they are independent

  • so their mean has mean $\dfrac{n \mu}{n}=\mu$ and variance $\dfrac{n\sigma^2}{n^2}=\dfrac{\sigma^2}{n}$, since dividing by the constant $n$ divides the variance by $n^2$

  • leading to their mean having standard deviation $\sqrt{\dfrac{\sigma^2}{n}} = \dfrac{\sigma}{\sqrt{n}}$ by taking the square root of the variance
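You can check the steps above numerically. Here is a minimal sketch (variable names like `num_samples` are my own) that repeatedly draws samples of size $n$ from a normal population and compares the standard deviation of the sample means against $\sigma/\sqrt{n}$:

```python
import random
import statistics

random.seed(0)

mu, sigma = 10.0, 4.0   # population mean and standard deviation
n = 25                  # sample size
num_samples = 20000     # number of repeated samples

# Draw many samples of size n and record each sample's mean
sample_means = [
    statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(num_samples)
]

# Standard deviation of the sample means (the empirical standard error)
empirical_se = statistics.pstdev(sample_means)

# The theoretical value sigma / sqrt(n) = 4 / 5 = 0.8
theoretical_se = sigma / n ** 0.5

print(empirical_se, theoretical_se)
```

With enough repeated samples, `empirical_se` should land close to 0.8, matching the derivation above.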