Calculating standard deviation from a set of data


I'm trying to create a normal distribution of numbers between 0 and 100. I know that the mean = 28, and the only other information about the data is that there is a 10% chance that the number is 44, and a 1% chance of it being 74. Other than that, it should be distributed around the mean.

So my question is how do I calculate the standard deviation based on this?

Thanks in advance!


1 Answer

You should use a normal distribution table (or its calculator equivalent) to find the z-score associated with 99%: $P(Z<z)=0.99$ is equivalent to $P(Z>z)=0.01$, which gives $z\approx 2.326$. Then solve $z=(74-28)/\sigma$ for $\sigma$, giving $\sigma \approx 46/2.326 \approx 19.77$.
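The calculation above can be sketched in Python using the standard library's `statistics.NormalDist` in place of a z-table; this assumes the 1%-chance statement means $P(X > 74) = 0.01$ for $X \sim N(28, \sigma^2)$:

```python
from statistics import NormalDist

mean = 28

# z-score such that P(Z < z) = 0.99, i.e. P(Z > z) = 0.01
z = NormalDist().inv_cdf(0.99)   # ~2.326

# Solve z = (74 - mean) / sigma for sigma
sigma = (74 - mean) / z          # ~19.77

print(round(sigma, 2))
```

Note that the same approach applied to the 10%-at-44 condition would give a different $\sigma$, so the two stated probabilities cannot both hold exactly for a single normal distribution; you would have to pick one (or compromise between them).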