Standard Error of Coefficients in Simple Linear Regression


In the book "Introduction to Statistical Learning", page 66, there are formulas for the standard errors of the coefficient estimates $\hat{\beta}_0$ and $\hat{\beta}_1$. I know the proof of $SE(\hat{\beta}_1)$, but I am confused about how to derive the formula $$SE(\hat{\beta}_0)^2 = \sigma^2\left[\frac{1}{n} + \frac{\bar{x}^2}{{\sum_{i=1}^n}(x_i - \bar{x})^2}\right]$$ since $\sigma^2 = Var(\epsilon)$, not the variance of the $y_i$'s.

My calculation so far is as follows: $$Var(\hat{\beta}_0) = Var(\bar{y} - \hat{\beta}_1\bar{x}) = Var(\bar{y}) + \bar{x}^2\frac{\sigma^2}{{\sum_{i=1}^n}(x_i - \bar{x})^2} - 2\bar{x} Cov(\bar{y}, \hat{\beta}_1) $$in which $\sigma^2 = Var(\epsilon)$.

$Cov(\bar{y}, \hat{\beta}_1) = 0$ since $\bar{y}$ and $\hat{\beta}_1$ can be shown to be uncorrelated. Also, $Var(\bar{y}) = \frac{\sigma^2}{n}$, in which $\sigma^2 = Var(y_i)$.

So how can we arrive at the formula for $SE(\hat{\beta}_0)^2$ above, given that the two $\sigma$'s appear to be different from each other?

Many thanks in advance for your help!


1 Answer


You have found out that

$$Var(\hat{\beta}_0) = Var(\bar{y} - \hat{\beta}_1\bar{x}) = Var(\bar{y}) + \bar{x}^2\frac{\sigma^2}{{\sum_{i=1}^n}(x_i - \bar{x})^2} - 2\bar{x} \underbrace{Cov(\bar{y}, \hat{\beta}_1)}_{=0}$$ in which $Var(\bar{y}) =\frac{\sigma^2}{n}$. The key point resolving your confusion is that the two $\sigma^2$'s are the same: treating the $x_i$ as fixed, $Var(y_i) = Var(\beta_0 + \beta_1 x_i + \epsilon_i) = Var(\epsilon_i) = \sigma^2$, since $\beta_0 + \beta_1 x_i$ is a constant.
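For completeness, here is why the covariance term vanishes (a standard argument, not spelled out in the book): writing $\hat{\beta}_1 = \frac{\sum_{j=1}^n (x_j - \bar{x})y_j}{\sum_{j=1}^n (x_j - \bar{x})^2}$ and using the independence of the $y_i$ with $Var(y_i) = \sigma^2$,

$$Cov(\bar{y}, \hat{\beta}_1) = \frac{1}{n\sum_{j=1}^n (x_j - \bar{x})^2}\sum_{i=1}^n (x_i - \bar{x})\,Var(y_i) = \frac{\sigma^2}{n\sum_{j=1}^n (x_j - \bar{x})^2}\underbrace{\sum_{i=1}^n (x_i - \bar{x})}_{=0} = 0.$$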

Inserting $\frac{\sigma^2}{n}$ for $Var(\bar{y}) $
gives

$$Var(\hat{\beta}_0) = \frac{\sigma^2}{n} + \bar{x}^2\frac{\sigma^2}{{\sum_{i=1}^n}(x_i - \bar{x})^2}=\sigma^2 \cdot \left(\frac{1}{n} + \bar{x}^2\frac{1}{{\sum_{i=1}^n}(x_i - \bar{x})^2} \right)$$
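As a quick sanity check (my own illustration, not part of the book), a short Monte Carlo simulation with a fixed design confirms the formula: the empirical variance of $\hat{\beta}_0$ over many simulated datasets matches $\sigma^2\left(\frac{1}{n} + \frac{\bar{x}^2}{\sum_i (x_i-\bar{x})^2}\right)$. All names and parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed design: the x_i are held constant across simulated datasets.
n = 30
x = np.linspace(0.0, 10.0, n)
beta0, beta1, sigma = 2.0, 0.5, 1.5  # illustrative true parameters

xbar = x.mean()
sxx = np.sum((x - xbar) ** 2)

# Theoretical variance of beta0_hat from the formula derived above.
var_beta0_theory = sigma**2 * (1.0 / n + xbar**2 / sxx)

# Monte Carlo: refit OLS on many datasets with fresh noise eps ~ N(0, sigma^2).
reps = 20_000
b0_hats = np.empty(reps)
for r in range(reps):
    y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=n)
    b1_hat = np.sum((x - xbar) * y) / sxx       # OLS slope
    b0_hats[r] = y.mean() - b1_hat * xbar       # OLS intercept

var_beta0_mc = b0_hats.var()
print(var_beta0_theory, var_beta0_mc)  # the two values should be close
```

The simulated variance agrees with the closed-form expression to within Monte Carlo error, and the same setup also verifies $Var(\bar{y}) = \sigma^2/n$ if you record `y.mean()` instead of the intercept.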