General Gauss-Markov theorem


$Y=X\boldsymbol{\beta}+u$, where $X$ is a non-random $n\times k$ matrix, $\operatorname{rank}(X)=k$, $E(u)=0$, and $E(uu')=\sigma^2\Omega$. (1) How does one formulate, and (2) how does one prove, the general Gauss-Markov theorem?

Accepted answer:

The Gauss-Markov Theorem states that the OLS estimator:

$$\hat{\boldsymbol{\beta}}_{OLS} = (X'X)^{-1}X'Y$$

is the Best Linear Unbiased Estimator (BLUE). For the proof, I will work with conditional expectations and variances; the unconditional results follow directly. Also, I take $\Omega = I_{n}$; the argument extends to general positive-definite $\Omega$ by transforming the model with $\Omega^{-1/2}$, which yields the GLS estimator (Aitken's theorem).

Proof:

Is it unbiased?

\begin{align} E(\hat{\boldsymbol{\beta}}_{OLS} \mid X) &= E[(X'X)^{-1}X'Y \mid X] = E[(X'X)^{-1}X'(X\boldsymbol{\beta} + u) \mid X] \\ &= \boldsymbol{\beta} + (X'X)^{-1}X'E(u \mid X) = \boldsymbol{\beta} \end{align}
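So the estimator is unbiased. As a quick sanity check, here is a minimal Monte Carlo sketch in NumPy; the sample size, design matrix, true $\boldsymbol{\beta}$, and $\sigma$ are illustrative choices of mine, not from the question, and $\Omega = I_{n}$ as assumed above:

```python
import numpy as np

# Illustrative setup (my choices): n = 50 observations, k = 3 regressors,
# sigma = 1, Omega = I_n, with a fixed (non-random) design matrix X.
rng = np.random.default_rng(0)
n, k, sigma = 50, 3, 1.0
X = rng.normal(size=(n, k))           # held fixed across replications
beta = np.array([1.0, -2.0, 0.5])     # true coefficient vector

reps = 20_000
estimates = np.empty((reps, k))
for r in range(reps):
    u = sigma * rng.normal(size=n)    # E(u) = 0, Var(u) = sigma^2 I_n
    Y = X @ beta + u
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ Y)   # (X'X)^{-1} X'Y

print(estimates.mean(axis=0))         # approx. [1.0, -2.0, 0.5] = beta
```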

Is it, then, the one with the smallest variance among all linear unbiased estimators? Consider, for this purpose, a general linear unbiased estimator $\boldsymbol{b}$:

$$\boldsymbol{b} = CY$$

where $C$ is a generic $k$ $\times$ $n$ matrix that depends only on the sample information in $X$; unbiasedness requires $CX = I_{k}$, since $E(\boldsymbol{b} \mid X) = CX\boldsymbol{\beta} + CE(u \mid X) = CX\boldsymbol{\beta}$. Note that for $\hat{\boldsymbol{\beta}}_{OLS}$, $C_{OLS} = (X'X)^{-1}X'$. It can be proved that:

$$Var(\hat{\boldsymbol{\beta}}_{OLS} \mid X) = \sigma^{2}(X'X)^{-1}$$

and, for the generic linear estimator:

$$Var(\boldsymbol{b} \mid X) = Var(Cu \mid X) = C\,Var(u \mid X)\,C' = \sigma^{2}CC'.$$
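Both formulas are easy to check numerically; continuing the illustrative snippet above:

```python
# Continuing the Monte Carlo snippet above.
emp_cov  = np.cov(estimates, rowvar=False)     # simulated Var(beta_hat | X)
theo_cov = sigma**2 * np.linalg.inv(X.T @ X)   # sigma^2 (X'X)^{-1}
print(np.abs(emp_cov - theo_cov).max())        # small; shrinks as reps grows

C_ols = np.linalg.solve(X.T @ X, X.T)          # (X'X)^{-1} X'
print(np.allclose(sigma**2 * C_ols @ C_ols.T, theo_cov))  # True: sigma^2 CC'
```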

We can additionally define $D$ $=$ $C$ $-$ $C_{OLS}$. Since $CX = I_{k}$ and $C_{OLS}X = (X'X)^{-1}X'X = I_{k}$, it is immediate that $DX = 0$. From this, we can finally conclude that:

\begin{align} Var(\boldsymbol{b} \mid X) &= \sigma^{2}CC' = \sigma^{2}[D + (X'X)^{-1}X'][D' + X(X'X)^{-1}] \\ &= \sigma^{2}(X'X)^{-1} + \sigma^{2}DD' \\ &= Var(\hat{\boldsymbol{\beta}}_{OLS} \mid X) + \sigma^{2}DD' \\ &\geq Var(\hat{\boldsymbol{\beta}}_{OLS} \mid X) \end{align}

The cross terms vanish because $DX = 0$ (so $DX(X'X)^{-1} = 0$, and likewise its transpose), and the final inequality holds in the positive semidefinite (matrix) sense, since $DD'$ is positive semidefinite.
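To see the conclusion in action, one can continue the same sketch: build an arbitrary linear unbiased estimator $\boldsymbol{b} = CY$ by adding to $C_{OLS}$ a perturbation $D$ with $DX = 0$, and confirm that the variance difference equals $\sigma^{2}DD'$ and is positive semidefinite:

```python
# Continuing the snippet above: an arbitrary linear unbiased estimator.
M = np.eye(n) - X @ C_ols                    # annihilator of X: M @ X = 0
D = rng.normal(size=(k, n)) @ M              # any A @ M satisfies D @ X = 0
C = C_ols + D                                # hence C @ X = I_k (unbiased)

var_b   = sigma**2 * C @ C.T                 # sigma^2 C C'
var_ols = sigma**2 * np.linalg.inv(X.T @ X)  # sigma^2 (X'X)^{-1}
diff = var_b - var_ols

print(np.allclose(diff, sigma**2 * D @ D.T))     # True: equals sigma^2 D D'
print(np.linalg.eigvalsh(diff).min() >= -1e-10)  # True: positive semidefinite
```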