While doing my homework and checking my answers against the book's, I noticed that sometimes the standard deviation is divided by $\sqrt n$, where $n$ is the sample size. I'm a little confused. For my current problem I am trying to find the estimated standard error of the estimator. In a previous part of the problem I found that $\hat \sigma=.33853$, and the sample consists of $16$ measurements. Now the standard error is $.084633$, which is indeed $\frac{\hat \sigma}{\sqrt{16}}$. When I found the standard deviation I didn't divide by $4$, so what's different this time?
What situation calls for dividing the standard deviation by $\sqrt n$?
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
$$\sigma_{\bar x} = \frac{\sigma}{\sqrt{n}}$$
This formula gives the standard deviation of the sampling distribution of $\bar x$, i.e. the standard error of the mean.
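As a quick check, a short Python sketch using the numbers from the question ($\hat\sigma = 0.33853$, $n = 16$):

```python
import math

sigma_hat = 0.33853   # estimated standard deviation from the question
n = 16                # sample size

# Standard error of the mean: sigma_hat / sqrt(n)
se = sigma_hat / math.sqrt(n)
print(se)  # about 0.0846, matching the book's 0.084633
```

This reproduces the book's value: dividing by $\sqrt{16} = 4$ converts the spread of a single measurement into the spread of the average of 16 measurements.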
I found this paper that could be helpful to some of you: why do we divide by square root of n
It provides both a proof and the intuition for why the standard deviation of a single observation differs from that of the sample mean.
For a normal distribution, the expectation of the average of a sample of size $n$ equals the population expectation; however, the standard deviation of that average is the population standard deviation divided by $\sqrt n$. You can read about the square root of $n$ law or the Central Limit Theorem, which should be in your stats book somewhere.
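The $\sqrt n$ scaling is easy to see empirically. Here is a hedged Python sketch (the parameters $\sigma = 1$, $n = 16$, and the trial count are illustrative choices, not from the original problem) that draws many samples of size $n$ and measures the spread of their means:

```python
import random
import statistics

random.seed(0)
sigma, n, trials = 1.0, 16, 20000

# Draw many samples of size n from N(0, sigma^2) and record each sample mean
means = [statistics.fmean(random.gauss(0, sigma) for _ in range(n))
         for _ in range(trials)]

# The spread of the sample means should be close to sigma / sqrt(n) = 0.25,
# even though each individual observation has standard deviation sigma = 1
print(statistics.stdev(means))
```

Each individual draw varies with standard deviation $1$, but the averages of $16$ draws vary with standard deviation close to $1/\sqrt{16} = 0.25$, which is exactly the $\sqrt n$ factor in the question.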