Calculation of Standard Deviation of Sample Mean

This is another statistics problem that I have, which I cannot make sense of:

An IQ test is normally distributed with $\mu = 100$ and $\sigma = 10$. What is the standard deviation of the sample mean for a sample of size $n = 50$?

We know the mean is $\mu = n \times p$, which we can solve for $n = \frac{100}{p}$ and substitute into $\operatorname{Var}(X) = n \times p(1-p)$, which is $\sigma^2$; yet I don't seem to be able to obtain the standard deviation $\sigma$ this way. The supposed solution to this problem is $1.4142$.

Thanks for your help!

1 Answer

The question says normal, as in the normal distribution. The fact that you are applying a formula that relates to the binomial distribution indicates that you do not understand what you are doing or what the formulas you are using actually mean.

Go back and review your textbook and notes pertaining to the following topics:

  • Normal distribution
  • Sample mean
  • Sampling distribution

Then show how you have modified your computation accordingly.
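For what it's worth, here is a minimal simulation sketch (in Python with NumPy; the setup and variable names are my own illustration, not part of the original question). It draws many samples of size $n = 50$ from a normal population with $\mu = 100$ and $\sigma = 10$, computes each sample's mean, and compares the spread of those means with $\sigma / \sqrt{n}$.

```python
import numpy as np

# Illustrative sketch (assumed setup): simulate the sampling distribution of
# the mean for samples of size n = 50 from a Normal(mu=100, sigma=10) population.
rng = np.random.default_rng(0)

mu, sigma, n = 100, 10, 50
num_samples = 100_000

# Draw num_samples independent samples of size n and compute each sample's mean.
sample_means = rng.normal(mu, sigma, size=(num_samples, n)).mean(axis=1)

# The empirical standard deviation of the sample means should be close to the
# theoretical standard error sigma / sqrt(n).
print("empirical SD of sample means:", sample_means.std())
print("theoretical sigma / sqrt(n): ", sigma / np.sqrt(n))
```

Running this, the empirical standard deviation of the sample means comes out very close to $10 / \sqrt{50} \approx 1.4142$, which matches the supposed solution quoted in the question.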