Show that $\hat{\beta}$ and $\hat{\sigma}^2$ are unbiased in a special case of the linear regression model


Consider the OLS model $Y = X\beta + \epsilon$, where the rows of the matrix $X$ are independent multivariate normal vectors with mean $0$ and covariance matrix $\Sigma$. The vector $\epsilon$ is independent of $X$, with $E(\epsilon) = 0$ and $Var(\epsilon) = \sigma^2 I$.

I need to show that $\hat{\beta} = (X^T X)^{-1} X^T Y$ and $\hat{\sigma}^2 = \frac{1}{rank(I-H)} \sum_{i=1}^n \hat{\epsilon}_i^2$, where $H = X(X^T X)^{-1}X^T$, are still unbiased estimators.

I would really appreciate any help.

**Answer**

$$ \mathbb{E}[\hat{\beta}|X]= \mathbb{E}[(X'X)^{-1}X'y|X] = (X'X)^{-1}X'(X\beta + \mathbb{E}[\epsilon|X])= (X'X)^{-1}(X'X)\beta = \beta. $$
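This conditional-expectation argument can be checked numerically. The sketch below (with illustrative choices of $n$, $p$, $\beta$, $\sigma$, and $\Sigma = I$, none of which come from the post) averages the OLS estimate over many independent draws of the random design $X$ and the noise $\epsilon$, and the average lands near the true $\beta$:

```python
import numpy as np

# Monte Carlo check that beta_hat = (X'X)^{-1} X'y is unbiased when the
# rows of X are random Gaussian vectors independent of the noise.
# n, p, beta, sigma are illustrative choices.
rng = np.random.default_rng(0)
n, p = 50, 3
beta = np.array([1.0, -2.0, 0.5])
sigma = 1.5

betas = []
for _ in range(2000):
    X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)  # random design
    eps = rng.normal(0.0, sigma, size=n)                         # noise, independent of X
    y = X @ beta + eps
    betas.append(np.linalg.solve(X.T @ X, X.T @ y))              # OLS estimate

beta_bar = np.mean(betas, axis=0)
print(beta_bar)  # close to [1.0, -2.0, 0.5]
```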

Assuming in addition that $\epsilon$ is Gaussian, $$ \frac{1}{\sigma^2}\sum_{i=1}^n \hat{\epsilon}^2_i \,\Big|\, X \sim \chi^2_{n-p} $$ and $$ rank(I-H) = n-p, $$ where $p = rank(X)$;

hence, $$ \mathbb{E}[\hat{\sigma}^2|X] = \frac{\sigma^2}{n-p} \mathbb{E}[ \chi^2_{n-p} |X] = \frac{\sigma^2(n-p)}{n-p} = \sigma^2. $$

Since neither conditional expectation depends on $X$, the law of iterated expectations gives $\mathbb{E}[\hat{\beta}] = \beta$ and $\mathbb{E}[\hat{\sigma}^2] = \sigma^2$ unconditionally. (Without normality of $\epsilon$, the same conclusion follows from $\mathbb{E}[\hat{\epsilon}^\top \hat{\epsilon}\,|\,X] = \sigma^2 \operatorname{tr}(I-H) = \sigma^2(n-p)$.)
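The unbiasedness of $\hat{\sigma}^2 = \frac{1}{n-p}\sum_i \hat{\epsilon}_i^2$ can likewise be verified by simulation. The sketch below (again with illustrative $n$, $p$, $\sigma^2$ not taken from the post) forms the hat matrix $H = X(X^TX)^{-1}X^T$, computes residuals via $I - H$, and averages $\hat{\sigma}^2$ over many replications:

```python
import numpy as np

# Monte Carlo check that sigma2_hat = SSR / rank(I - H) = SSR / (n - p)
# is unbiased; n, p, sigma2 are illustrative choices.
rng = np.random.default_rng(1)
n, p = 40, 4
sigma2 = 2.0

estimates = []
for _ in range(4000):
    X = rng.standard_normal((n, p))            # rows ~ N(0, I): random design
    eps = rng.normal(0.0, np.sqrt(sigma2), n)  # noise, independent of X
    y = X @ rng.standard_normal(p) + eps       # residuals do not depend on beta
    H = X @ np.linalg.solve(X.T @ X, X.T)      # hat matrix H = X(X'X)^{-1}X'
    resid = (np.eye(n) - H) @ y                # residuals epsilon_hat = (I - H)y
    estimates.append(resid @ resid / (n - p))  # SSR / (n - p)

sigma2_bar = np.mean(estimates)
print(sigma2_bar)  # close to 2.0
```

Note that the residuals satisfy $(I-H)y = (I-H)\epsilon$, so any choice of $\beta$ gives the same check.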