An athlete specialized in long jump events jumps an average of $\bar x = 7.91\,m$ over $12$ trials. The standard error of the mean jump distance in these trials is $0.2\,m$.
Is it plausible that, when the athlete performs $10000$ jumps, the average distance of these jumps is $8.05\,m$ or more?
Do you have any idea how this problem can be solved?
Thanks in advance
As I see it, there are two parts to a solution.
For both parts, there is a trade-off between simplicity and precision.

For the first part, we need a model for a single jump. We could use maximum likelihood estimation to find the values of $\mu$ and $\sigma^2$ that maximize the joint likelihood of the given data (more difficult), or simply take $\mu = 7.91$ as given and recover the per-jump standard deviation from the standard error via $\sigma = 0.2\sqrt{12} \approx 0.69$ (easier), or find a compromise.

For the second part, you have to decide which possible measurements count as "at least as unusual as what we observed," and compute the probability of that event under the model from part 1. The more difficult approach is to determine which possible values of the data have likelihood lower than the ones actually observed, and then compute the probability that the likelihood is that low. The easier approach is just to compute the probability that the absolute difference between the two sample averages is at least as large as the observed $8.05 - 7.91 = 0.14$.
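A minimal sketch of the easier approach, using only the Python standard library. It assumes the per-jump standard deviation implied by the standard error ($\sigma = 0.2\sqrt{12}$), treats the two sample means as independent normals, and computes a one-sided tail probability for the observed $0.14\,m$ difference; the variable names are my own, not from the problem statement.

```python
import math

# Given data from the problem statement
xbar_12 = 7.91      # mean of the 12 trials (m)
se_12 = 0.2         # standard error of that mean (m)
n_12 = 12
xbar_big = 8.05     # hypothetical mean of the 10000 jumps (m)
n_big = 10_000

# Per-jump standard deviation implied by the standard error: se = sigma / sqrt(n)
sigma = se_12 * math.sqrt(n_12)            # ~0.69 m

# Standard error of the difference of the two independent sample means
se_diff = math.sqrt(se_12**2 + (sigma / math.sqrt(n_big))**2)

# One-sided tail probability that the new average exceeds the old one by
# at least 0.14 m; the normal survival function is 0.5 * erfc(z / sqrt(2))
z = (xbar_big - xbar_12) / se_diff
p = 0.5 * math.erfc(z / math.sqrt(2))
print(f"z = {z:.3f}, one-sided p = {p:.3f}")
```

The second sample mean contributes almost nothing to `se_diff` (its standard error is about $0.007\,m$), so the answer is dominated by the uncertainty in the original $12$-trial estimate; the resulting tail probability is roughly $0.24$, so an average of $8.05\,m$ or more is quite plausible under this model.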