I'm studying Exercise 16.19 in the book Probability Essentials. The question is to show that $\operatorname{Var}(\hat\epsilon_i)=\sigma^2\left({{n-1}\over n}-{{(x_i-\bar x)^2}\over{\sum_{j=1}^n (x_j-\bar x)^2}}\right)$,
where $\hat\epsilon_i=Y_i-A-Bx_i$.
The $\epsilon_i$ are i.i.d. $N(0,\sigma^2)$, and $A$ and $B$ are the least-squares estimators of $\alpha$ and $\beta$ in the linear regression model $Y_i=\alpha+\beta x_i+\epsilon_i$. Standard results give:
\begin{align}
\operatorname{Var}(A)&={\sigma^2\sum x_i^2\over n\sum (x_i-\bar x)^2}. \\
\operatorname{Var}(B)&={\sigma^2\over \sum (x_i-\bar x)^2}. \\
\operatorname{Cov}(A,B)&={-\sigma^2\bar x\over \sum (x_i-\bar x)^2}
\end{align}
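These moment formulas can be verified numerically. The sketch below is a Monte Carlo check, where the design points $x$, the noise level $\sigma$, the true coefficients, and the trial count are all arbitrary choices for illustration, not values from the book:

```python
import numpy as np

# Monte Carlo check of Var(A), Var(B), Cov(A, B) for the least-squares
# estimators in Y_i = alpha + beta*x_i + eps_i with eps_i i.i.d. N(0, sigma^2).
# All numeric choices below are arbitrary illustration values.
rng = np.random.default_rng(0)
n, sigma, alpha, beta = 6, 1.5, 2.0, 0.7
x = np.array([0.0, 1.0, 2.0, 4.0, 7.0, 11.0])  # fixed design points
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)                  # sum of (x_i - xbar)^2

trials = 400_000
Y = alpha + beta * x + rng.normal(0.0, sigma, (trials, n))
# Least-squares slope and intercept, one per simulated data set:
B = (Y - Y.mean(axis=1, keepdims=True)) @ (x - xbar) / Sxx
A = Y.mean(axis=1) - B * xbar

print(A.var(), sigma**2 * np.sum(x**2) / (n * Sxx))  # empirical vs. Var(A)
print(B.var(), sigma**2 / Sxx)                       # empirical vs. Var(B)
print(np.cov(A, B)[0, 1], -sigma**2 * xbar / Sxx)    # empirical vs. Cov(A, B)
```

With enough trials the empirical moments land close to the formulas above; in particular, note that $\operatorname{Var}(B)$ and $\operatorname{Cov}(A,B)$ have $\sum(x_i-\bar x)^2$ alone in the denominator, with no extra factor of $n$.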
My attempt:
\begin{align}
\operatorname{Var}(\hat\epsilon_i)&=\operatorname{Var}(Y_i-A-Bx_i)\\
&=\operatorname{Var}(Y_i)+\operatorname{Var}(A+Bx_i) &&\text{(assuming $Y_i$ is independent of $A$ and $B$)}\\
&=\sigma^2+\operatorname{Var}(A)+x_i^2\operatorname{Var}(B)+2x_i\operatorname{Cov}(A,B)
\end{align}
After substituting the expressions above, I still don't get the stated answer. I suspect that $Y_i$ is not actually independent of $A$ and $B$, so the second step is invalid. What is the correct way to do it?
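For what it's worth, the target identity can be sanity-checked by simulation before attempting the algebra. The sketch below fits the regression on many simulated data sets and compares the empirical variance of each residual $\hat\epsilon_i$ with the claimed formula; the design $x$, $\sigma$, true coefficients, and trial count are arbitrary illustration choices:

```python
import numpy as np

# Monte Carlo check of Var(eps_hat_i) = sigma^2*((n-1)/n - (x_i - xbar)^2/Sxx).
# All numeric choices below are arbitrary illustration values.
rng = np.random.default_rng(1)
n, sigma, alpha, beta = 8, 2.0, 1.0, 0.5
x = np.arange(1.0, n + 1)          # fixed design points 1, ..., n
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

trials = 200_000
Y = alpha + beta * x + rng.normal(0.0, sigma, (trials, n))
# Least-squares fit per trial, then the residuals eps_hat_i = Y_i - A - B*x_i:
B = (Y - Y.mean(axis=1, keepdims=True)) @ (x - xbar) / Sxx
A = Y.mean(axis=1) - B * xbar
resid = Y - A[:, None] - B[:, None] * x

empirical = resid.var(axis=0)
theory = sigma**2 * ((n - 1) / n - (x - xbar) ** 2 / Sxx)
print(np.round(empirical, 3))
print(np.round(theory, 3))
```

The two printed vectors agree to Monte Carlo accuracy, which confirms the identity (and, since each entry is strictly below $\sigma^2$, confirms that the independence assumption in my attempt must be wrong).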