Unbiased Estimate of Variance


Consider a simple linear regression model for $n$ observations, $$Y_i = \beta_1 X_i + \epsilon_i,$$ where $\epsilon_i \sim N(0,\sigma^2)$. I want to show that $$\hat{\sigma}^2 = \frac{1}{n-2} \sum_{i=1}^n \hat{\epsilon}_i^2$$ is an unbiased estimate of $\sigma^2$, where $\hat{\epsilon}_i$ is an unbiased estimate of $\epsilon_i$. The $n-2$ confused me enough, but when it came to calculating $E[\hat{\sigma}^2]$, I found that $$E[\hat{\sigma}^2] = E\left[ \frac{1}{n-2} \sum_{i=1}^n \hat{\epsilon}_i^2 \right] = \frac{1}{n-2} \sum_{i=1}^n E[\hat{\epsilon}_i^2] = \frac{1}{n-2} \sum_{i=1}^n \left( \operatorname{Var}[\hat{\epsilon}_i] + \left(E[\hat{\epsilon}_i]\right)^2 \right),$$ which uses the definition of variance. However, I am having some difficulty figuring out $\operatorname{Var}[\hat{\epsilon}_i]$, although I am rather sure that $E[\hat{\epsilon}_i] = \epsilon_i$ given unbiasedness. Note that we have defined $\hat{\epsilon}_i = Y_i-\hat{Y}_i$, where $\hat{Y}_i = \hat{\beta}_1 X_i$ for all $i \in [n]$. Any recommendations on this?



Best answer

The standard assumption on the noise/error term is $\epsilon_i \sim \mathcal{N}(0, \sigma^2)$, so each $\epsilon_i$ has mean zero. Hence, as @Ian mentioned in the comments, and using the fact that the least-squares slope is unbiased ($E[\hat{\beta}_1] = \beta_1$), $$ E[\hat{\epsilon}_i] = E[Y_i - \hat{Y}_i] = E[Y_i] - E[\hat{\beta}_1 X_i] = \beta_1 X_i - \beta_1 X_i = 0. $$
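As a quick sanity check, a short Monte Carlo simulation of the no-intercept model confirms that the residuals $\hat{\epsilon}_i = Y_i - \hat{\beta}_1 X_i$ average to zero across repeated samples. This is only a sketch: the design points, sample size, and parameter values below are arbitrary choices for illustration, not taken from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Model from the question: Y_i = beta_1 * X_i + eps_i, with eps_i ~ N(0, sigma^2).
n, beta1, sigma = 20, 2.0, 1.5
x = np.linspace(1.0, 5.0, n)  # fixed design points (arbitrary choice)

n_sims = 20_000
resid_sum = np.zeros(n)
for _ in range(n_sims):
    y = beta1 * x + rng.normal(0.0, sigma, size=n)
    beta1_hat = (x @ y) / (x @ x)   # least-squares slope for the no-intercept model
    resid_sum += y - beta1_hat * x  # residuals eps_hat_i = Y_i - Y_hat_i

# Per-observation average residual across simulations; should be near 0,
# matching E[eps_hat_i] = 0 from the answer above.
mean_resid = resid_sum / n_sims
print(np.max(np.abs(mean_resid)))
```

With 20,000 replications the largest per-observation average residual comes out on the order of $10^{-2}$ or smaller, consistent with $E[\hat{\epsilon}_i] = 0$.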