I am so confused with this problem:
The middle 95% of adults have an IQ between 60 and 140. Assume that IQ for adults is normally distributed. a. What is the average IQ for adults? The standard deviation?
I got the average by subtracting the given values and then multiplying by 95%, but I don't know how to get the standard deviation because the population size isn't given. Any ideas, anyone?
Since it is the "middle" $95\%$, with equal tails of $2.5\%$ on each side, the mean must be $\frac{60+140}{2}=100$. (Recall that the normal distribution is symmetric about its mean.)
From tables of the standard normal, the point with $2.5\%$ in the right tail is $1.96$ standard deviation units above the mean. Since $140-100=40$, if $\sigma$ is the population standard deviation, then $1.96\sigma=40$, giving $\sigma=\frac{40}{1.96}\approx 20.4$.
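If you want to check this numerically, here is a minimal Python sketch (assuming SciPy is available; any normal-quantile function would do):

```python
from scipy.stats import norm

low, high = 60, 140

# Symmetry: the mean sits midway between the two cutoffs.
mu = (low + high) / 2  # 100.0

# z-score leaving 2.5% in the right tail of a standard normal.
z = norm.ppf(0.975)  # ~1.9600

# Solve z * sigma = (high - mu) for sigma.
sigma = (high - mu) / z  # ~20.41

print(f"mean = {mu}, sigma = {sigma:.2f}")

# Sanity check: the central 95% interval should recover (60, 140).
print(norm.interval(0.95, loc=mu, scale=sigma))
```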