Mean square error of using the sample average to predict $X_{n+1}$


Let $X_1, X_2, \cdots, X_n$ be a sample from a population with mean $\mu$ and variance $\sigma^2$. Using $\bar{X_n} = \sum_{i=1}^n X_i / n$ as a predictor for $X_{n+1}$, determine the mean square error of this predictor.

I was able to solve this problem by directly calculating $E[(X_{n+1} - \bar{X_n})^2]$, and the answer is $(1+1/n)\sigma^2$. However, if I try to use the decomposition $\text{MSE} = \text{Bias}^2 + \text{Variance}$, I get a different answer. Since $\bar{X_n}$ is unbiased, the MSE should equal the variance, which in this case is

$$V(\bar{X_n}) = \frac{1}{n^2} \sum V(X_i) = \frac{\sigma^2}{n}$$

Why is this wrong?


Best answer

Assuming that $X_1, X_2, \ldots, X_{n+1}$ is a random sample, $X_{n+1}$ is independent of $\bar{X}_n$, so the cross term in the expansion vanishes:
$$
\mathsf{E}\big[(X_{n+1}-\bar{X}_n)^2\big]=\mathsf{E}\big[(X_{n+1}-\mu)-(\bar{X}_n-\mu)\big]^2=\operatorname{Var}(X_{n+1})+\operatorname{Var}(\bar{X}_n)=\sigma^2+\frac{\sigma^2}{n}.
$$
Note that the target $X_{n+1}$ is itself random, not a fixed parameter, so the "Bias + Variance" decomposition of the MSE (which applies when estimating a constant such as $\mu$) is not appropriate here. Your computation $\operatorname{Var}(\bar{X}_n)=\sigma^2/n$ is the MSE of $\bar{X}_n$ as an estimator of $\mu$, not as a predictor of $X_{n+1}$; the extra $\sigma^2$ is the irreducible variance of the new observation.
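A quick Monte Carlo simulation can confirm the result. This is a sketch (the sample size $n$, $\sigma^2$, and the normal distribution are arbitrary choices, not from the question); it estimates $E[(X_{n+1}-\bar{X}_n)^2]$ empirically and compares it with $(1+1/n)\sigma^2$:

```python
import numpy as np

# Sketch: empirically estimate the prediction MSE E[(X_{n+1} - Xbar_n)^2].
# n, sigma2, and the normal distribution are illustrative assumptions.
rng = np.random.default_rng(0)
n, sigma2, trials = 10, 4.0, 200_000

# Each row: n observations plus one held-out "next" observation X_{n+1}.
samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n + 1))
xbar = samples[:, :n].mean(axis=1)           # Xbar_n from the first n draws
mse = np.mean((samples[:, n] - xbar) ** 2)   # empirical prediction MSE

print(mse)                  # should be close to (1 + 1/n) * sigma2
print((1 + 1/n) * sigma2)   # theoretical value: 4.4 here
```

With these settings the empirical MSE lands near $4.4$, matching $(1+1/n)\sigma^2$ rather than the estimation variance $\sigma^2/n = 0.4$.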