OLS Assumptions, consistency


How would I go about verifying the assumptions under which OLS is the best linear unbiased estimator (BLUE)? What tests should I use to show that the assumptions hold? And how would I know whether the estimator is consistent?

Thanks


Regarding your last question on consistency: the link(s) above address the other properties (BLUE), but consistency is a little more subtle. Consider the simple linear model $Y_i = \beta_0 + \beta_1X_i + \epsilon_i$, with the classical assumptions on $\epsilon_i$, where the OLS estimator of $\beta_1$ is $$ \hat{\beta}_1 = \frac{\sum X_iY_i-n\bar{X}\bar{Y}}{\sum X_i^2-n\bar{X}^2} = \frac{(\sum X_iY_i-n\bar{X}\bar{Y})/n}{(\sum X_i^2-n\bar{X}^2)/n}. $$ For a bivariate normal vector $(X,Y)$ the slope is $\beta_1 = \frac{\text{cov}(X,Y)}{\sigma^2_X}$. So let's look at the numerator, $(\sum X_iY_i-n\bar{X}\bar{Y})/n = \frac{1}{n}\sum X_i Y_i - \bar{X}\bar{Y}$. By the WLLN the first term $\frac{1}{n}\sum X_i Y_i$ converges in probability to $\mathbb{E}[XY]$, while the second term converges to $\mathbb{E}X\,\mathbb{E}Y$. Hence the numerator converges to $\mathbb{E}[XY] - \mathbb{E}X\,\mathbb{E}Y = \text{cov}(X,Y)$. The same logic applied to the denominator shows that it converges to $\text{var}(X)$. However, for the ratio to converge to $\text{cov}(X,Y)/\text{var}(X)$ you need the continuous mapping theorem, e.g., with $g(x,y) = x-y^2$ for the denominator and then $h(u,v) = u/v$, which is continuous wherever $v \neq 0$. Hence, since each component converges in probability to a constant, the ratio converges to the ratio of the limiting constants, i.e., indeed $\hat{\beta}_1 \xrightarrow{p}\beta_1$ as $n\to \infty$. The same argument can be used to show that $\hat{\beta}_0 \xrightarrow{p} \beta_0$.
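You can also see this convergence empirically. Below is a small Monte Carlo sketch (not part of the proof, just an illustration): it simulates the model above with arbitrary parameter values $\beta_0 = 1$, $\beta_1 = 2$, computes $\hat{\beta}_1$ as the sample analogue of $\text{cov}(X,Y)/\text{var}(X)$, and shows the estimate tightening around $\beta_1$ as $n$ grows.

```python
# Monte Carlo illustration of the consistency argument: beta1_hat -> beta1 in
# probability as n grows. Parameter values and distributions are arbitrary
# choices for the demo, not prescribed by the original question.
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 2.0  # arbitrary true parameters

def beta1_hat(n):
    """OLS slope estimate from one simulated sample of size n."""
    X = rng.normal(size=n)
    eps = rng.normal(size=n)          # classical i.i.d. errors
    Y = beta0 + beta1 * X + eps
    # sample analogue of cov(X, Y) / var(X), matching the formula above:
    # ((1/n) sum X_i Y_i - Xbar*Ybar) / ((1/n) sum X_i^2 - Xbar^2)
    return ((X * Y).mean() - X.mean() * Y.mean()) / ((X ** 2).mean() - X.mean() ** 2)

for n in (100, 10_000, 1_000_000):
    print(f"n = {n:>9}: beta1_hat = {beta1_hat(n):.4f}")
```

As $n$ increases the printed estimates cluster ever more tightly around $2$, which is exactly what $\hat{\beta}_1 \xrightarrow{p} \beta_1$ predicts.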