Let $n$ be the number of orders in a 120-minute period, where the completion times of the orders are independent and identically distributed random variables.
Given that the mean completion time per order is $\mu = 1.5$ minutes and the standard deviation is $\sigma = 1$ minute, use the Central Limit Theorem to find the largest value of $n$ that gives a 95% chance of completing all orders in that time.
I understand that the mean time per order is 1.5 minutes with a standard deviation of 1 minute, but I am unsure how to find the largest $n$ without resorting to trial and error.
I'm thinking that it's something along the lines of
$0.95 = P\!\left(\frac{S_n - 1.5n}{\sqrt{n}} \le \frac{120 - 1.5n}{\sqrt{n}}\right),$
where $S_n$ is the total completion time of the $n$ orders, which by the CLT is approximately $N(1.5n,\, n)$.
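One way to avoid trial and error: the CLT condition $\frac{120 - 1.5n}{\sqrt{n}} \ge z_{0.95}$ becomes a quadratic in $u = \sqrt{n}$, namely $\mu u^2 + z\sigma u - 120 \le 0$, which can be solved in closed form. A rough sketch in Python (assuming i.i.d. order times with the stated $\mu$ and $\sigma$; variable names are my own):

```python
import math
from statistics import NormalDist

mu, sigma, T, p = 1.5, 1.0, 120, 0.95

# CLT: total time for n orders is approximately Normal(n*mu, sigma*sqrt(n)).
# We need (T - n*mu) / (sigma*sqrt(n)) >= z, where z = Phi^{-1}(p).
z = NormalDist().inv_cdf(p)  # ~1.645

# Substituting u = sqrt(n) gives mu*u^2 + z*sigma*u - T <= 0, a quadratic in u,
# so u <= (-z*sigma + sqrt(z**2 * sigma**2 + 4*mu*T)) / (2*mu).
u = (-z * sigma + math.sqrt(z**2 * sigma**2 + 4 * mu * T)) / (2 * mu)
n = math.floor(u**2)
print(n)  # 70

def clt_prob(k):
    """Approximate P(total time for k orders <= T) under the CLT."""
    return NormalDist().cdf((T - k * mu) / (sigma * math.sqrt(k)))

# Sanity check: n works, n + 1 does not.
print(clt_prob(n) >= p, clt_prob(n + 1) >= p)  # True False
```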
How would the result differ if I used the Chebyshev inequality instead of the CLT?
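For comparison, the one-sided Chebyshev (Cantelli) inequality, $P(S_n - n\mu \ge t) \le \frac{n\sigma^2}{n\sigma^2 + t^2}$, makes no normality assumption, so bounding the failure probability by $0.05$ forces $k = \sqrt{0.95/0.05} = \sqrt{19} \approx 4.36$ standard deviations instead of $z \approx 1.645$. A sketch under the same assumptions as above:

```python
import math

mu, sigma, T, alpha = 1.5, 1.0, 120, 0.05

# Cantelli: P(S_n - n*mu >= t) <= n*sigma^2 / (n*sigma^2 + t^2).
# Setting t = T - n*mu and requiring the bound <= alpha is equivalent to
# T - n*mu >= k*sigma*sqrt(n) with k = sqrt((1 - alpha) / alpha).
k = math.sqrt((1 - alpha) / alpha)  # sqrt(19) ~ 4.36, versus z ~ 1.645 for the CLT

# Same quadratic in u = sqrt(n) as before, with k in place of z.
u = (-k * sigma + math.sqrt(k**2 * sigma**2 + 4 * mu * T)) / (2 * mu)
n_cheb = math.floor(u**2)
print(n_cheb)  # 57
```

So Chebyshev guarantees only $n = 57$ versus $n = 70$ from the CLT: it is much more conservative, since it holds for every distribution with the given mean and variance rather than exploiting approximate normality.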