Assume we observe $(x_1, Y_1), ..., (x_n,Y_n)$ with $Y_i = f(x_i)+ \epsilon_i$, where the $x_i$ are fixed, and we want to predict a new observation $Y_0 = f(x_0)+ \epsilon_0$, where $Var(\epsilon_i) = 1$, $Var(\epsilon_0) = 1$, and $\epsilon_0$ is independent of $\epsilon_1, ..., \epsilon_n$. Does there exist any estimator $\hat{f}(x_0)$ which depends only on the data $(x_1, Y_1), ..., (x_n,Y_n)$ and satisfies $\mathbb{E}[(\hat{f}(x_0)-Y_0)^2] = \dfrac{1}{2}$? Why or why not?
I am having trouble telling which quantities are fixed and which are random. Also, any suggestions on how to begin would be great. Thanks!
Hint: Carefully use the assumptions in the problem (in particular, the independence of $\epsilon_0$ from the data) to show $$E[(\hat{f}(x_0) - f(x_0))(f(x_0) - Y_0)]=0.$$ Consequently, you can show that your expectation can be decomposed as $$E[(\hat{f}(x_0)-Y_0)^2] = E[(\hat{f}(x_0) - f(x_0))^2] + E[(f(x_0)-Y_0)^2].$$ Now simplify the second term on the right-hand side, and note that the first term is nonnegative.
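The decomposition can also be checked numerically. The sketch below (assuming $f = \sin$ and $x_0 = 0.5$, both arbitrary choices for illustration) estimates $E[(\hat{f}(x_0)-Y_0)^2]$ by Monte Carlo for the best conceivable case, the oracle estimator $\hat{f}(x_0) = f(x_0)$, for which the first term of the decomposition is exactly zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup for illustration: f(x) = sin(x), x_0 = 0.5.
f = np.sin
x0 = 0.5

# Oracle estimator: hat{f}(x_0) = f(x_0), so only the second term
# of the decomposition, E[(f(x_0) - Y_0)^2] = Var(eps_0), remains.
n_trials = 200_000
Y0 = f(x0) + rng.standard_normal(n_trials)  # Y_0 = f(x_0) + eps_0, Var(eps_0) = 1
mse_oracle = np.mean((f(x0) - Y0) ** 2)

print(mse_oracle)
```

The estimated MSE concentrates near $1$, not $1/2$, which matches what the decomposition predicts: since any real estimator adds a nonnegative first term on top of this, no estimator can do better than the oracle's value.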