In the linear regression model $Y=\beta_0 +\beta_1X+u$, how can we prove that $\mathrm{Var}(\hat\beta_0)={\overline{X^2}\,\sigma^2\over\sum x^2}$, where $x=X-\bar X$ and $\overline{X^2}=\frac1n\sum X_i^2$?
Variance of the intercept parameter in a linear regression model
4.5k Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
It's easiest to work in matrices.
$\hat{\beta} =\bigl(\begin{smallmatrix} \hat\beta_0\\ \hat\beta_1 \end{smallmatrix} \bigr)=(X'X)^{-1}X'Y=\beta+(X'X)^{-1}X'u$
where $X = (\mathbf{1}\;\;\mathbf{x})$ is the $n\times 2$ design matrix, $\mathbf{1}$ is an $n\times1$ vector of 1's, and $\mathbf{x}$ is an $n\times1$ vector of the regressor values $X_i$.
Assuming homoskedasticity, we have that
$Var(\hat{\beta})=(X'X)^{-1}\sigma^2$
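You can check this covariance formula by simulation. The sketch below (hypothetical example data: a fixed regressor, true coefficients $\beta_0=2$, $\beta_1=0.5$, and $\sigma=1.5$ are my choices, not from the question) draws many samples of $Y$, computes the OLS estimates each time, and compares their empirical covariance with $\sigma^2(X'X)^{-1}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed regressor values and true parameters
n = 50
xvals = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), xvals])  # design matrix: (1  x)
beta = np.array([2.0, 0.5])               # true (beta_0, beta_1)
sigma = 1.5                               # error standard deviation

# Repeatedly simulate Y = X beta + u and collect the OLS estimates
reps = 20000
estimates = np.empty((reps, 2))
for r in range(reps):
    y = X @ beta + rng.normal(0.0, sigma, size=n)
    # beta_hat = (X'X)^{-1} X'y, solved without forming the inverse
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)

empirical_cov = np.cov(estimates, rowvar=False)
theoretical_cov = sigma**2 * np.linalg.inv(X.T @ X)

print(empirical_cov)
print(theoretical_cov)
```

With 20,000 replications the two matrices should agree to within a few percent in every entry.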
where $(X'X)^{-1} = \bigl(\begin{smallmatrix} n&\sum X_i\\ \sum X_i&\sum X^2_i \end{smallmatrix} \bigr)^{-1}$ (here the $X_i$ are the raw regressor values, not the deviations $x_i = X_i - \bar X$).
The variance you want is $\sigma^2$ times the $(1,1)$ element of this inverse, which is $\frac{\sum X^2_i}{n\sum X_i^2 -(\sum X_i)^2}$. Divide numerator and denominator by $n^2$, and note that $\frac1n\sum X_i^2-\bar X^2=\frac1n\sum x_i^2$, to get $\mathrm{Var}(\hat\beta_0)=\frac{\overline{X^2}\,\sigma^2}{\sum x_i^2}$ with $\overline{X^2}=\frac1n\sum X_i^2$, which is your answer.
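The final algebraic step can also be verified numerically: the $(1,1)$ element of $(X'X)^{-1}$ should equal $\overline{X^2}/\sum x_i^2$. A minimal sketch, using arbitrary regressor values of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary regressor values (hypothetical)
Xv = rng.normal(5.0, 2.0, size=30)
n = Xv.size
X = np.column_stack([np.ones(n), Xv])   # design matrix (1  x)

# (1,1) element of (X'X)^{-1}: this is Var(beta0_hat) / sigma^2
top_left = np.linalg.inv(X.T @ X)[0, 0]

# Closed form: mean of the squared X's over the centered sum of squares
x = Xv - Xv.mean()                       # deviations from the mean
closed_form = np.mean(Xv**2) / np.sum(x**2)

print(top_left, closed_form)
```

The two printed numbers should coincide up to floating-point error, confirming $\mathrm{Var}(\hat\beta_0)=\sigma^2\,\overline{X^2}/\sum x_i^2$.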