The Question
Suppose $y_{i}=x_{i}^{'}\beta+\epsilon_{i}$ for $i=1,\dots,n,n+1$, where $E[\epsilon|x]=0$ and $E[\epsilon\epsilon^{'}|x]=\sigma^{2}I$. We observe $y_{1}, \dots, y_{n}$ as well as $x_{1}, \dots, x_{n}, x_{n+1}$. Let $\hat\beta_{n}$ be the OLS estimator based on the first $n$ observations. We want to predict $y_{n+1}$.
Define $\hat y_{n+1} = x_{n+1}^{'}\hat\beta_{n}$, and let $y_{n+1}^{*} = x_{n+1}^{'}\beta$ be the population predictor of $y_{n+1}$. Let $\hat \epsilon_{n+1} = y_{n+1} - \hat y_{n+1}$. Verify whether $\hat y_{n+1}$ is unbiased, i.e. whether $E[\hat \epsilon_{n+1}|x_{n+1}]=0$.
My Understanding
$$E[\hat \epsilon_{n+1}|x_{n+1}]= E[y_{n+1} - \hat y_{n+1}|x_{n+1}]=E[y_{n+1}|x_{n+1}] - E[\hat y_{n+1}|x_{n+1}]$$
By linearity and strict exogeneity, $E[\epsilon|x]=0$, we know $E[y_{i}|x]=x_{i}^{'}\beta$, thus
$$E[y_{n+1}|x_{n+1}] - E[\hat y_{n+1}|x_{n+1}] = x_{n+1}^{'}\beta - x_{n+1}^{'}\hat \beta_{n} = x_{n+1}^{'}(\beta - \hat \beta_{n})$$
By the LLN (law of large numbers),
$$\beta - \hat \beta_{n} \stackrel{p}{\longrightarrow}0$$
and so $$E[\hat \epsilon_{n+1}|x_{n+1}] = E[y_{n+1}|x_{n+1}] - E[\hat y_{n+1}|x_{n+1}] = 0$$
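One way to sanity-check the claim $E[\hat \epsilon_{n+1}|x]=0$ is a small Monte Carlo experiment: hold the regressors fixed (i.e. condition on $x_1,\dots,x_{n+1}$), redraw the errors many times, refit OLS on the first $n$ observations each time, and average the prediction error. A sketch in NumPy, where the values of $\beta$, $\sigma$, $n$ are hypothetical choices, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, reps = 50, 3, 20000

# Condition on the regressors: draw x_1, ..., x_{n+1} once and hold them fixed.
X = rng.normal(size=(n, k))
x_next = rng.normal(size=k)
beta = np.array([1.0, -2.0, 0.5])  # hypothetical true beta
sigma = 1.5                        # hypothetical error s.d.

errs = np.empty(reps)
for r in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)            # fresh errors each rep
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]      # OLS on first n obs
    y_next = x_next @ beta + sigma * rng.normal()        # new observation y_{n+1}
    errs[r] = y_next - x_next @ beta_hat                 # prediction error

print(errs.mean())  # close to 0 across Monte Carlo draws
```

The simulated mean prediction error hovers near zero, which is consistent with unbiasedness holding exactly at finite $n$ (via $E[\hat\beta_n|X]=\beta$), not only asymptotically.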
Am I doing this the right way? Is it correct to apply $\beta - \hat \beta_{n} \stackrel{p}{\longrightarrow}0$ in this case?
Compute $Var(\hat \epsilon_{n+1}|x)$
So far I got $$Var(\hat \epsilon_{n+1}|x) = E[\hat \epsilon_{n+1}^{2}|x]-E[\hat \epsilon_{n+1}|x]^{2}$$
If my attempt above is right, i.e. $E[\hat \epsilon_{n+1}|x]=0$, then
$$Var(\hat \epsilon_{n+1}|x) = E[\hat \epsilon_{n+1}^{2}|x]$$
But then I'm not sure how to proceed from here. Plugging in $\hat \epsilon_{n+1} = y_{n+1} - \hat y_{n+1}$ seems to over-complicate it.
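For intuition before doing the algebra, one can estimate $Var(\hat \epsilon_{n+1}|x)$ by simulation with the regressors held fixed, and compare it against the standard textbook expression $\sigma^{2}(1 + x_{n+1}^{'}(X^{'}X)^{-1}x_{n+1})$. A sketch, again with hypothetical $\beta$, $\sigma$, $n$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, reps = 50, 3, 20000

# Condition on the regressors: draw them once and hold them fixed.
X = rng.normal(size=(n, k))
x_next = rng.normal(size=k)
beta = np.array([1.0, -2.0, 0.5])  # hypothetical true beta
sigma = 1.5                        # hypothetical error s.d.

errs = np.empty(reps)
for r in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS on first n obs
    y_next = x_next @ beta + sigma * rng.normal()
    errs[r] = y_next - x_next @ beta_hat

# Textbook conditional prediction-error variance: sigma^2 * (1 + x'(X'X)^{-1} x)
XtX_inv = np.linalg.inv(X.T @ X)
var_formula = sigma**2 * (1 + x_next @ XtX_inv @ x_next)
print(errs.var(), var_formula)  # the two estimates should be close
```

The simulated variance matching the formula reflects the decomposition $\hat \epsilon_{n+1} = \epsilon_{n+1} + x_{n+1}^{'}(\beta - \hat\beta_{n})$, whose two terms are uncorrelated given the regressors.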