The time taken for a computer to connect to a server is normally distributed with a mean of 3.3 seconds and a standard deviation of 0.66 seconds.
(a) A computer is said to have a fast connection time if it connects in less than 2.5 seconds. What percentage of computers might one expect to fall into this category?
(b) In a sample comprising 155 computers, how many would be expected to have connection times of over 4 seconds?
(c) What is the connection time for the slowest 5% of the computers?
That's the problem I have to complete. Can anyone explain/show me how to do it?
Thanks :)
You first need to normalize your values. If a computer connects in 2.5 seconds, that is 3.3 - 2.5 = 0.8 seconds less than the mean, which is 0.8/0.66 = 1.212 standard deviations below the mean, i.e. a z-score of -1.212. You now need the area under the standard normal curve to the left of z = -1.212.
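In code, the normalization step is just the z-score formula z = (x - mu) / sigma; a quick sketch with the problem's numbers:

```python
# z-score for a 2.5 s connection time, given mean 3.3 s and sd 0.66 s
mu, sigma = 3.3, 0.66
z = (2.5 - mu) / sigma   # about -1.212: 2.5 s is ~1.212 sd below the mean
print(z)
```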
The function that gives the area to the left of a point on the standard normal curve is the cumulative distribution function, usually written Φ; the related Q function you may also see gives the upper-tail area, Q(z) = 1 - Φ(z). You can ask Wolfram Alpha to calculate Φ(-1.212) for you; it gives about 0.1128, so roughly 11.3% of connections will be "fast". You can use similar reasoning for part (b), using the area to the right of 4 seconds, and part (c) is the same thing, but backwards: start from the area (the slowest 5%) and solve for the connection time.
Note that the two conventions differ by a complement, Φ(z) = 1 - Q(z), not by function inversion. If you accidentally used the upper-tail Q function here, you would get Q(-1.212) ≈ 0.887, and you would notice something funny going on: an answer of 88.7% "fast" connections can't be right, since 2.5 seconds is quicker than the mean of 3.3 seconds, so well under half the computers can connect that fast. That's a handy sanity check for which tail you've computed.
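To tie all three parts together, here's a worked sketch using Python's standard-library `statistics.NormalDist` (the distribution parameters are the 3.3 s mean and 0.66 s standard deviation from the problem; `cdf` is the left-tail area Φ and `inv_cdf` is its inverse):

```python
from statistics import NormalDist

# Connection time distribution from the problem: mean 3.3 s, sd 0.66 s
conn = NormalDist(mu=3.3, sigma=0.66)

# (a) Fraction connecting in under 2.5 s: area to the left of 2.5
p_fast = conn.cdf(2.5)          # ~0.1127, i.e. about 11.3%

# (b) Expected count out of 155 with times over 4 s: area to the right of 4
p_slow = 1 - conn.cdf(4.0)      # ~0.1444
expected_slow = 155 * p_slow    # ~22.4, so about 22 computers

# (c) Cut-off for the slowest 5%: the 95th percentile of connection time
t_slowest = conn.inv_cdf(0.95)  # ~4.39 seconds

print(round(p_fast, 4), round(expected_slow, 1), round(t_slowest, 2))
```

Note that part (c) asks for the time exceeded by only 5% of computers, which is why it uses the 95th percentile rather than the 5th.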