Mean deviation (statistics)


A student repeats experiments for the period of a pendulum with the same stopwatch. His measurements vary among one another by 1/10 second on the average. How many times must he repeat the experiment in order to determine the period to an accuracy of 1/100 second?

I have no clue how to proceed with this question.


There is 1 answer below.


This may be a 'drill' problem from a statistics textbook. If $\sigma = 1/10$ (taking the stated average scatter of the measurements as the standard deviation), and the data can be assumed to be normal, then a 95% confidence interval (CI) for the true mean period is $\bar X \pm 1.96\sigma/\sqrt{n}.$

The quantity $1.96\sigma/\sqrt{n}$ (half the width of the CI) is called the 'margin of error'. Maybe your problem wants the margin of error to be 1/100. If so, solve $1.96\sigma/\sqrt{n} = 1/100$ for $n.$
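Under that reading, a minimal sketch of the computation, assuming the average scatter of 1/10 s is used as the standard deviation and a 95% confidence level is intended (neither is stated in the problem):

```python
import math

# Assumption: treat the stated average scatter as sigma.
sigma = 0.10  # s, spread of individual measurements
E = 0.01      # s, desired margin of error
z = 1.96      # normal critical value for a 95% CI

# Solve z * sigma / sqrt(n) = E for n, rounding up
# since n must be a whole number of repetitions.
n = math.ceil((z * sigma / E) ** 2)
print(n)  # 384.16 rounds up to 385
```

If instead one simply requires the standard error $\sigma/\sqrt{n}$ itself to equal 1/100 (no confidence multiplier), the same algebra gives $n = (0.10/0.01)^2 = 100$, which is the answer many physics texts expect.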

Disclaimer: I am only suggesting this because my guess above is a typical kind of problem in statistics courses when CIs are under discussion. However, typically the problems are much more clearly stated than this one, so there is no way for me to know if this is actually what is required.