A student repeats measurements of the period of a pendulum with the same stopwatch. His measurements vary among one another by 1/10 second on average. How many times must he repeat the experiment in order to determine the period to an accuracy of 1/100 second?
I have no clue how to proceed with this question.
This may be a 'drill' problem from a statistics textbook. If the spread of the individual timings is taken as the standard deviation, so $\sigma = 1/10,$ and the data can be assumed to be normal, then a 95% confidence interval (CI) for the true mean period is $\bar X \pm 1.96\sigma/\sqrt{n}.$
The quantity $1.96\sigma/\sqrt{n}$ (half the width of the CI) is called the 'margin of error'. Maybe your problem wants the margin of error to be 1/100. If so, solve $1.96\sigma/\sqrt{n} = 1/100$ for $n.$
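Under that interpretation (σ = 0.1 s is an assumption about what "vary by 1/10 second" means, and 1.96 corresponds to a 95% confidence level), a minimal sketch of the rearrangement in Python:

```python
from math import ceil

# Assumed values: sigma = 0.1 s (spread of individual timings),
# desired margin of error E = 0.01 s, z = 1.96 for 95% confidence.
sigma = 0.1
E = 0.01
z = 1.96

# Solve z * sigma / sqrt(n) = E  =>  n = (z * sigma / E)**2, rounded up
# to the next whole repetition.
n = ceil((z * sigma / E) ** 2)
print(n)  # 385
```

The rounding up is because $n$ must be a whole number of repetitions and the margin of error shrinks as $n$ grows, so the smallest integer at or above $(1.96\sigma/E)^2$ is the first that meets the target.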