Please could I have some help with the following question? My initial thought was that $U_i$ must be less than $5$ in absolute value so that the measurement of the melting point is within $5$ degrees of $c$, so I set up the inequality
$$P\left(\left|\bar U_n - 0\right| < 5\right) \ge \frac{391}{5^2 n},$$ where I set $\frac{391}{5^2 n}$ equal to $90\%$.
However, solving this for $n$ does not give me the integer answer the question asks for, so I'm not sure what to do next.
"You want to determine the melting point $c$ of a new material. You have $n$ specimens, on each of which you make a measurement of the melting point in degrees Kelvin, giving you a dataset $m_1, \dots, m_n$. We model this with random variables $M_i = c + U_i$, where $U_i$ is the random measurement error. It is known that $E[U_i] = 0$ and $Var(U_i) = 391$ for each $i$, and that we may consider the random variables $M_1, M_2, \dots$ as independent. According to Chebyshev's inequality, how many measurements do you need to perform to be 90% sure that the average of the measurements is within 5 degrees of $c$?"
$$Var\left(\frac{\sum_{i=1}^n M_i}{n} \right)=Var\left(\frac{\sum_{i=1}^n U_i}{n} \right)=\frac{Var(U_1)}{n}$$
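As a quick sanity check of this variance formula (not part of the question), here is a small simulation. The Gaussian choice for the errors is purely an assumption for illustration; Chebyshev's inequality itself needs no distributional assumption.

```python
import math
import random

# Illustrative check that Var(average of n i.i.d. errors) = Var(U_1) / n.
# Gaussian errors are an assumption made only for this simulation.
random.seed(0)
n, trials = 50, 20_000
sigma = math.sqrt(391)          # so Var(U_i) = 391

means = []
for _ in range(trials):
    errors = [random.gauss(0, sigma) for _ in range(n)]
    means.append(sum(errors) / n)

mu = sum(means) / trials
var_of_mean = sum((m - mu) ** 2 for m in means) / trials
print(var_of_mean)              # should be close to 391 / 50 = 7.82
```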
$$P\left( \left|\frac{\sum_{i=1}^n M_i}{n}-c \right| \ge \frac{k\sqrt{Var(U_1)}}{\sqrt{n}}\right) \le \frac1{k^2}$$
If $\epsilon = \frac{k\sqrt{Var(U_1)}}{\sqrt{n}}$, then $k = \frac{\epsilon \sqrt{n}}{\sqrt{Var(U_1)}}$. Hence,
$$P\left( \left|\frac{\sum_{i=1}^n M_i}{n}-c \right| \ge \epsilon\right) \le \frac{Var(U_1)}{\epsilon^2 n}$$
$$P\left( \left|\frac{\sum_{i=1}^n M_i}{n}-c \right| < \epsilon\right) \ge 1- \frac{Var(U_1)}{\epsilon^2 n}$$
Hence, we want $$1- \frac{Var(U_1)}{\epsilon^2 n} \ge 0.9.$$ With $Var(U_1) = 391$ and $\epsilon = 5$, this gives $n \ge \frac{391}{5^2 \cdot 0.1} = 156.4$. Since $n$ must be an integer, round up: $n = 157$ measurements suffice.
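The final step can be sketched numerically (variable names here are my own, not from the question):

```python
import math

# Solve 1 - Var(U_1) / (eps^2 * n) >= confidence for n,
# with Var(U_1) = 391, eps = 5 degrees, confidence = 90%.
var_u, eps, confidence = 391, 5, 0.9
n_min = var_u / (eps ** 2 * (1 - confidence))   # real-valued lower bound
n = math.ceil(n_min)                            # smallest integer that works
print(n_min, n)
```

Since the bound is rarely an integer, the non-integer value you found is expected: the answer is the smallest integer at least that large.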