What value of the mean makes the probability of a random sample mean falling in the observed confidence interval 95%?


Question: The expression we derive for the 95% confidence interval involves Sn = (1/n) * sum Xi, i = 1 to n, which is a random variable. So the "location" of the confidence interval is a random variable, and it may or may not cover mean_x. However, when we actually compute a confidence interval from data, it is no longer a random variable; it is called the "observed" confidence interval, and it is computed as X_bar +- 1.96*(s_x/sqrt(n)). What would the value of mean_x have to be so that the probability that the random sample mean falls into the observed confidence interval is 95%? Show work.
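To make the "observed" interval concrete, here is a minimal sketch of the X_bar +- 1.96*(s_x/sqrt(n)) computation; the data values below are made up purely for illustration:

```python
import statistics

# Made-up sample data; any numeric sample would do.
data = [4.1, 5.3, 4.8, 5.9, 4.4, 5.1, 4.7, 5.6, 5.0, 4.9]
n = len(data)
xbar = statistics.mean(data)      # observed sample mean
s_x = statistics.stdev(data)      # observed sample standard deviation
half = 1.96 * s_x / n ** 0.5      # half-width of the 95% interval

lo, hi = xbar - half, xbar + half
print((lo, hi))  # the "observed" confidence interval
```

Once the numbers are plugged in, `lo` and `hi` are fixed constants, which is exactly the sense in which the observed interval is no longer random.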

I am confused about what the question is asking, but here is my work:

Suppose X1, X2, ..., Xn have a sample mean of mean_s, a standard deviation of std_s, and a sample size of n.

Then we need mean_s - 1.96*std_s < mean_x < mean_s + 1.96*std_s for mean_x to be covered by the 95% confidence interval.

Since std_s = s_x/sqrt(n), this becomes mean_s - 1.96*s_x/sqrt(n) < mean_x < mean_s + 1.96*s_x/sqrt(n).

So am I just computing an interval for mean_x?
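One way to test my reading numerically is a quick Monte Carlo sketch. It assumes mean_x sits exactly at the center of the observed interval (all numbers here are made up), draws fresh samples, and checks how often the new sample mean lands inside that fixed interval:

```python
import random
import statistics

random.seed(0)
n = 100
mean_x = 5.0           # assumed true mean, placed at the interval's center
s_x = 2.0              # assumed (observed) sample standard deviation
se = s_x / n ** 0.5    # standard error of the sample mean

# Fixed "observed" 95% interval, centered on mean_x by assumption
lo, hi = mean_x - 1.96 * se, mean_x + 1.96 * se

# Draw many fresh samples and count how often their mean falls inside
trials = 20_000
hits = 0
for _ in range(trials):
    sample = [random.gauss(mean_x, s_x) for _ in range(n)]
    if lo <= statistics.mean(sample) <= hi:
        hits += 1
print(hits / trials)  # close to 0.95
```

Under that assumption the coverage comes out near 95%, which suggests the question is asking where mean_x must sit relative to the observed interval, not for a range of values.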