What is the standard deviation of the mean of 10000 computer generated random numbers between 0 and 1?


So this is probably really simple to anyone who's actually taken stats. I want to generate a number of sets of random numbers from 0 to 1, then find the standard deviation of the means of those sets.

I'm trying to find out whether it's worth computing the mean of each set, or whether I can just assume a constant value, since the SD might be insignificant.

Edit: I know I can simulate it, but I'd really rather see the math behind it.


There are 3 best solutions below

You can compute the variance of a single random number, since it is drawn from a uniform distribution on $(0,1)$. The variance of the sum of $n$ independent draws is the sum of the variances, i.e. $n$ times the single-draw variance. Dividing the sum by $n$ to form the mean divides the variance by $n^2$, so the variance of the mean decreases as $1/n$.
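That scaling can be sketched in a few lines of Python (a sketch, assuming the single-draw variance of $1/12$ that the other answers derive; the function name is mine):

```python
import math

def sd_of_mean(n, var_single=1 / 12):
    """SD of the mean of n i.i.d. draws: the variance scales as var_single / n."""
    return math.sqrt(var_single / n)

# SD of one draw vs. SD of the mean of 10000 draws
print(sd_of_mean(1))      # ≈ 0.2887
print(sd_of_mean(10000))  # ≈ 0.0029
```

So averaging 10000 draws shrinks the spread by a factor of $\sqrt{10000} = 100$.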

If you are generating $n$ numbers between 0 and 1, you can express this as $U_1 , \ldots, U_n$ where each $U_i$ is a Uniform(0, 1) random variable.

In any given sample, the sample mean is $\bar{U} = \frac{\sum_{i=1}^n U_i}{n}$. So the expected value of the sample mean is just the mean of a single $U_i$ (use the linearity of expectation to work it out).

Since the $U_i$ are independent, their variances add (variances, not SDs!). So you have

$Var(\bar{U}) = \frac{1}{n^2} \sum_{i=1}^n Var(U_i)$

The variance of a single Uniform(0,1) is... I could post the answer here, but it's good if you look it up or work it out! After you have the variance, take the square root and that's your standard deviation of the sample means. Next, try some simulations to see that it's right!

For anyone who wants the more general answer rather than $n = 10000$: the variance formula for a continuous uniform distribution on $[a, b]$, from Wikipedia, is

$$Var(U(a,b)) = \frac{1}{12}(b-a)^2$$

For $U(0,1)$ that works out to $1/12$. The standard deviation of a single draw is simply the square root of that:

$$\sqrt{\frac{1}{12}} \approx 0.2886751346$$

Note that this is the SD of one draw; the SD of the mean of $n$ draws is this divided by $\sqrt{n}$, which for $n = 10000$ comes out to about $0.00289$.
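The general formula can be turned into the concrete answer for the original question like so (a sketch; variable names are mine):

```python
import math

a, b, n = 0.0, 1.0, 10000
var_single = (b - a) ** 2 / 12      # variance of one Uniform(a, b) draw
sd_single = math.sqrt(var_single)   # ≈ 0.2887 for Uniform(0, 1)
sd_mean = sd_single / math.sqrt(n)  # SD of the mean of n draws ≈ 0.00289
print(sd_single, sd_mean)
```

So for the original poster's purposes, the mean of 10000 such numbers is very tightly concentrated around $0.5$.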