Where does the $\sqrt{N}$ come from in the standard error of the mean formula?


When calculating z-scores for multiple samples, and wanting to describe the standard deviation of those sample means, I know the formula is $z = \frac{\bar x - \mu}{\sigma/\sqrt{N}}$, where $N$ is the sample size. Intuitively, why does dividing by $\sqrt{N}$ make sense?
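As a concrete instance of that formula (all the numbers here are made up for illustration):

```python
import math

# Hypothetical numbers: population mean 100, population SD 15,
# a sample of N = 9 with observed sample mean 105.
mu, sigma, N = 100.0, 15.0, 9
x_bar = 105.0

standard_error = sigma / math.sqrt(N)  # 15 / 3 = 5.0
z = (x_bar - mu) / standard_error      # 5 / 5 = 1.0
print(z)
```

So a sample mean of 105 sits one standard error above the population mean, even though a single observation of 105 would be only a third of a population standard deviation above it.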


The intuitive part is that the average of $n$ values is less variable than any single observation from the population.

Suppose you are sampling from a population that has variance $\sigma^2.$ That is $V(X_i) = \sigma^2.$ However, $V(\bar X) = \sigma^2/n.$ So the variance of $\bar X$ does decrease as $n$ increases, which matches intuition.
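You can see the $\sigma^2/n$ relationship directly by simulation. A small sketch (the normal population, $\sigma = 2$, and $n = 25$ are arbitrary choices for the demonstration):

```python
import random
import statistics

random.seed(42)

sigma = 2.0    # population SD (chosen for this sketch)
n = 25         # sample size
reps = 20_000  # number of simulated samples

# Draw many samples of size n from a normal population with SD sigma,
# and record each sample mean.
means = []
for _ in range(reps):
    sample = [random.gauss(0.0, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

# The SD of the sample means should come out close to
# sigma / sqrt(n) = 2 / 5 = 0.4, not to sigma itself.
print(statistics.stdev(means))
```

The individual observations have standard deviation 2, but the means of samples of 25 cluster about five times more tightly, in line with $\sigma/\sqrt{n}$.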

Then it follows that the "standard error of the mean" is $$SD(\bar X) = \sqrt{V(\bar X)} = \sqrt{\sigma^2/n} = \sigma/\sqrt{n},$$ as in @LarryB's link.

It is not stretching intuition to understand that means are less variable than individual observations. Suppose you are trying to estimate the average weight of melons in a crate; it contains melons of very different sizes. On any one draw from the crate we might get a huge one or a tiny one. But if we draw a dozen melons, it seems likely we will get ones of various sizes and their mean weight will be closer to the mean weight for the crate.

But my intuition does not tell me that the exact relationship must be $V(\bar X) = \sigma^2/n$ for the variance, or $SD = \sigma/\sqrt{n}$ for the standard error. For that, I need to do the math. That the variability is smaller, yes, that's intuitive; that we divide by exactly $\sqrt{n}$, no.
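The math is short, assuming the $X_i$ are independent draws with common variance $\sigma^2$, so that variances of sums add and constants come out squared:

$$V(\bar X) = V\!\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = \frac{1}{n^2}\sum_{i=1}^n V(X_i) = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}.$$

The $\sqrt{n}$ then appears simply because the standard deviation is the square root of the variance.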