Normal Distribution


I am so confused with this problem:

The middle 95% of adults have an IQ between 60 and 140. Assume that IQ for adults is normally distributed. a. What is the average IQ for adults? The standard deviation?

I got the average by subtracting the given values and then multiplying by 95%, but I don't know how to get the standard deviation because the population size isn't given. Any ideas, anyone?


There are 2 best solutions below


Since it is the "middle" $95\%$, meaning equal tails of $2.5\%$ on each side, the mean must be $\frac{60+140}{2}$. (Recall that the normal distribution is symmetric about the mean.)

From tables of the standard normal, the point which has $2.5\%$ in the right tail is $1.96$ standard deviation units from the mean. So if $\sigma$ is the population standard deviation, then $1.96\sigma = 140 - 100 = 40$, which gives $\sigma = 40/1.96 \approx 20.4$.
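If you want to carry the arithmetic through in code, here is a minimal sketch (assuming SciPy is available; the variable names are just for illustration) that recovers the same numbers:

```python
from scipy.stats import norm

lower, upper = 60, 140

# Symmetry: the mean sits at the midpoint of the interval.
mu = (lower + upper) / 2      # 100.0

# The middle 95% leaves 2.5% in each tail, so the upper endpoint
# is the 97.5th percentile of the distribution.
z = norm.ppf(0.975)           # about 1.96

# z * sigma equals half the width of the interval (140 - 100 = 40).
sigma = (upper - mu) / z      # about 20.4

print(mu, sigma)
```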


Actually the average IQ is 100 and its standard deviation is 15.

Intelligence tests are scored in such a way that the resulting IQ distribution conforms to these properties.

http://en.wikipedia.org/wiki/Intelligence_quotient
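As a quick numerical check of that convention, here is a short sketch (assuming SciPy is available) showing that with a mean of 100 and a standard deviation of 15, the middle 95% of scores falls roughly between 70.6 and 129.4 rather than between 60 and 140:

```python
from scipy.stats import norm

# Middle 95% interval under the usual IQ norming (mean 100, SD 15).
lower, upper = norm.interval(0.95, loc=100, scale=15)
print(lower, upper)   # approximately 70.6 and 129.4
```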