The expression comes from The Elements of Statistical Learning, p. 26, Eq. (2.27).
The following picture is from our teacher's slides:
I don't understand the red underlined part. Isn't the expectation of $\hat{y}_0$ equal to $f(x_0)$? I think: $E(\hat{y}_0)=E(x_0^T\hat{\beta})=x_0^T\beta=f(x_0)$. So why is $E_{\mathcal{T}}(f(x_0)-\hat{y}_0)^2=Var_{\mathcal{T}}(\hat{y}_0)+Bias^2(\hat{y}_0)$? Shouldn't it equal only the variance? Why is there also a bias term?
Maybe it's a silly question, but I'm quite confused. I'd appreciate your help. Thank you!
I think the expression they've written is correct in general, since the bias is not always $0$. But in this case (as you can see on the next line) the bias is $0$.
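To see where both terms come from, you can add and subtract $E_{\mathcal{T}}(\hat{y}_0)$ inside the square (a standard derivation; I write $\mu = E_{\mathcal{T}}(\hat{y}_0)$ as shorthand):

$$
E_{\mathcal{T}}\big(f(x_0)-\hat{y}_0\big)^2
= E_{\mathcal{T}}\big(\hat{y}_0-\mu\big)^2 + \big(f(x_0)-\mu\big)^2
= Var_{\mathcal{T}}(\hat{y}_0) + Bias^2(\hat{y}_0),
$$

where the cross term $2\big(f(x_0)-\mu\big)E_{\mathcal{T}}\big(\mu-\hat{y}_0\big)$ vanishes because $E_{\mathcal{T}}(\mu-\hat{y}_0)=0$. When $E_{\mathcal{T}}(\hat{y}_0)=f(x_0)$, the bias term is $0$, which is exactly the unbiasedness step used on the next line of the slide.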
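If it helps to see this numerically, here is a quick Monte Carlo sketch (my own toy setup, not from the slide: a linear model with $n=50$, Gaussian noise, and a fixed test point $x_0$). The decomposition MSE = Var + Bias² holds as an identity over the simulated training sets, and the squared bias of the OLS prediction comes out essentially $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = np.array([1.0, 2.0])   # true coefficients (assumed for illustration)
x0 = np.array([1.0, 0.5])     # fixed test point
f_x0 = x0 @ beta              # true value f(x_0)

# Draw many training sets T, fit OLS on each, predict at x_0
preds = []
for _ in range(5000):
    X = rng.normal(size=(50, 2))
    y = X @ beta + rng.normal(size=50)          # noisy responses
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    preds.append(x0 @ beta_hat)
preds = np.array(preds)

mse = np.mean((f_x0 - preds) ** 2)       # E_T (f(x_0) - yhat_0)^2
var = preds.var()                        # Var_T(yhat_0)
bias2 = (preds.mean() - f_x0) ** 2       # Bias^2(yhat_0)

print(f"MSE         = {mse:.6f}")
print(f"Var + Bias^2 = {var + bias2:.6f}")
print(f"Bias^2       = {bias2:.6f}")     # ~0 for OLS here
```

The first two printed numbers agree exactly (the decomposition is an algebraic identity over the sample of training sets), and the squared bias is negligible, matching the "next line" of the slide.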