Finding the standard deviation of a sample taken from a population with a uniform distribution


Consider a standard uniform density. The mean for this density is 0.5 and the variance is 1/12. You sample 1,000 observations from this distribution and compute the sample standard deviation.

What value would you expect it to be near?


I got this problem wrong, and unfortunately the solution I was given offered no explanation beyond the claim that the answer is $1/\sqrt{12}$ because the sample standard deviation is "consistent".

I was originally expecting it to equal the population standard deviation divided by the square root of the sample size, $1/\sqrt{12 \cdot 1000}$, since I am currently learning about the Central Limit Theorem and normal distributions, and I was under the impression that this is typically how one estimates it in this case.

Can someone provide a better explanation for the correct solution?


On BEST ANSWER

You are confusing the sample standard deviation with the standard error of the sample mean. They are not the same thing.

As you have already stated, the variance is $\sigma^2 = 1/12$. Therefore the standard deviation is $\sigma = 1/\sqrt{12}$. Now, if we calculate the sample standard deviation $$s = \sqrt{\frac{1}{n-1} \sum_{i=1}^n (x_i - \bar x)^2},\tag{1}$$ this will tend toward the true standard deviation as the sample size increases, because $s$ is an estimator of $\sigma$.
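To see formula $(1)$ in action, here is a minimal sketch (using numpy; the seed is an arbitrary choice) that computes $s$ by hand for a sample of 1,000 standard-uniform draws and checks it against the library routine. Both should land near $1/\sqrt{12} \approx 0.2887$:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(size=1000)  # 1,000 draws from the standard uniform
n = len(x)

# Formula (1): divide the sum of squared deviations by n - 1
s = np.sqrt(np.sum((x - x.mean()) ** 2) / (n - 1))

# numpy computes the same quantity when ddof=1 (the n - 1 divisor)
s_np = np.std(x, ddof=1)

print(s, s_np, 1 / np.sqrt(12))  # both near 0.2887
```

The `ddof=1` argument matters here: `np.std` defaults to the $n$ divisor, not the $n-1$ divisor in $(1)$.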

If on the other hand we wanted to compute the standard error of the sample mean, this is $\sqrt{\operatorname{Var}[\bar X]}$; that is to say, it is the square root of the variance of the sample mean. The standard error reflects how much variability the sample mean has as an estimator, and therefore should get smaller (closer to $0$) as the sample size increases.
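One way to make $\sqrt{\operatorname{Var}[\bar X]} = \sigma/\sqrt{n}$ concrete is to draw many samples of size $n$, take each sample's mean, and look at how spread out those means are. A sketch (the replicate count 5,000 and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1 / np.sqrt(12)  # true sd of the standard uniform

for n in (10, 100, 1000):
    # 5,000 independent samples of size n; one mean per sample
    means = rng.uniform(size=(5000, n)).mean(axis=1)
    # empirical sd of the sample means vs the theoretical sigma/sqrt(n)
    print(n, means.std(ddof=1), sigma / np.sqrt(n))
```

The empirical column tracks $\sigma/\sqrt{n}$ and shrinks toward $0$ as $n$ grows, while the spread of the individual draws stays fixed at $\sigma$.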

We will illustrate the difference with the following example. I generate a sample of size $n = 10$:

$$\{0.692785, 0.0670644, 0.149791, 0.532492, 0.95614, 0.951413, 0.310659, 0.0554809, 0.164028, 0.900758\}.$$

The sample mean is $\bar X = 0.478061$. The sample standard deviation is $$s = 0.374501.$$ The standard error of the sample mean is

$$SE = s/\sqrt{n} = 0.118428.$$

Note that $1/\sqrt{12} \approx 0.288675$. Now suppose I generate a sample of size $n = 1000$ (I will not show it here for obvious reasons). The sample mean I get is $\bar X = 0.483906$, the sample standard deviation is $s = 0.286231$, and the standard error of the sample mean is $SE = 0.00905142$.
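The comparison above can be reproduced with a few lines (the draws are random, so the exact numbers will differ from the answerer's):

```python
import numpy as np

rng = np.random.default_rng(2)
for n in (10, 1000):
    x = rng.uniform(size=n)
    s = x.std(ddof=1)    # sample standard deviation, formula (1)
    se = s / np.sqrt(n)  # standard error of the sample mean
    print(n, x.mean(), s, se)
```

At $n = 1000$ the printed $s$ sits close to $1/\sqrt{12} \approx 0.2887$, while the standard error is roughly $\sqrt{100}$ times smaller than at $n = 10$.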

$s$ estimates how much variation is present in individual outcomes of the uniform distribution, whereas $SE$ estimates how much variation is present in the mean of a sample of size $n$ drawn from that distribution.