Expected value of Squared Error


In the book *An Introduction to Statistical Learning*, Chapter 2, there is a section about prediction using regression. That is, given an input $\mathrm X$ and a quantitative response $\mathrm Y$, we assume the relation

$$ \mathrm Y = f(\mathrm X) + \epsilon $$

where $\epsilon$ is a random error term, independent of $\mathrm X$ and with mean zero. We can then learn a function $\hat f$ that produces the prediction $$ \hat Y = \hat f(X) $$

In this case, the expected value of the squared error given in the book is: $$ E(Y - \hat Y)^2 = E[f(X) + \epsilon - \hat f(X)]^2 $$ $$ = [f(X) - \hat f(X)]^2 + \operatorname{Var}(\epsilon) $$

I am unable to work out exactly how this derivation follows from the basic rules about random variables that I know. How does the expectation operator ($E$) on the right-hand side disappear?
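For what it's worth, the identity can at least be checked numerically. The sketch below (not from the book; the particular values of $f(x)$, $\hat f(x)$, and the noise scale are arbitrary choices for illustration) simulates $Y = f(x) + \epsilon$ at a single fixed input $x$ and compares the Monte Carlo average of $(Y - \hat Y)^2$ against $[f(x) - \hat f(x)]^2 + \operatorname{Var}(\epsilon)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative values: a fixed input x with true value f(x),
# a learned prediction fhat(x), and Gaussian noise with std sigma.
f_x = 2.0      # true f(x)
fhat_x = 1.5   # prediction \hat f(x)
sigma = 0.8    # std of epsilon, so Var(epsilon) = 0.64

# Simulate Y = f(x) + epsilon many times; average the squared error.
eps = rng.normal(0.0, sigma, size=1_000_000)
y = f_x + eps
mse = np.mean((y - fhat_x) ** 2)

# The claimed identity: reducible part + irreducible part.
predicted = (f_x - fhat_x) ** 2 + sigma ** 2

print(mse, predicted)  # the two numbers should agree closely
```

The key point the simulation reflects is that $f(x)$ and $\hat f(x)$ are treated as fixed numbers here, and only $\epsilon$ is random, which is why no expectation remains around the $[f(X) - \hat f(X)]^2$ term.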