Parameter Bias of an OLS Estimation


True Model:

$$Y = \alpha + \beta X + \gamma W + V$$

Model to be regressed on:

$$Y = \alpha + \beta X + U$$

Where:

$$U = \gamma W + V$$

So, in this model, if $\operatorname{Cov}(X,W) \neq 0$ and $\gamma \neq 0$, then $\hat{\beta}$ will be biased.

$$\hat{\beta} = \frac{\sum_{i=1}^{n}\left[(X_{i}-\bar{X})(Y_{i}-\bar{Y})\right]}{\sum_{i=1}^{n}\left[(X_{i}-\bar{X})^2\right]}$$

How can I compute what the bias is? I believe I should analyze:

$$\mathbb{E}[\hat{\beta}] - \beta = \mathbb{E}\left[ \frac{\sum_{i=1}^{n}\left[(X_{i}-\bar{X})(Y_{i}-\bar{Y})\right]}{\sum_{i=1}^{n}\left[(X_{i}-\bar{X})^2\right]} \right] -\beta $$
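Substituting the true model into the numerator (using $Y_i - \bar{Y} = \beta(X_i - \bar{X}) + \gamma(W_i - \bar{W}) + (V_i - \bar{V})$, and assuming $\operatorname{Cov}(X,V) = 0$), I get:

$$\hat{\beta} = \beta + \gamma\,\frac{\sum_{i=1}^{n}(X_{i}-\bar{X})(W_{i}-\bar{W})}{\sum_{i=1}^{n}(X_{i}-\bar{X})^2} + \frac{\sum_{i=1}^{n}(X_{i}-\bar{X})(V_{i}-\bar{V})}{\sum_{i=1}^{n}(X_{i}-\bar{X})^2}$$

Taking probability limits, the last term vanishes by exogeneity of $V$, and the middle term converges to $\gamma \operatorname{Cov}(X,W)/\operatorname{Var}(X)$.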

However, I also know that

$$\operatorname{plim}(\hat{\beta}) - \beta = \gamma \frac{\operatorname{Cov}(X,W)}{\operatorname{Var}(X)}$$

So, is $\gamma \frac{\operatorname{Cov}(X,W)}{\operatorname{Var}(X)}$ the answer for the asymptotic bias? If not, what approach should I take to evaluate $\mathbb{E}[\hat{\beta}] - \beta$?
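As a sanity check on the asymptotic-bias formula, here is a small Monte Carlo sketch (the parameter values and the data-generating choice $W = 0.8X + \text{noise}$ are hypothetical, chosen so that $\operatorname{Cov}(X,W) = 0.8$ and $\operatorname{Var}(X) = 1$, giving a predicted bias of $\gamma \cdot 0.8 = 0.4$):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, gamma = 1.0, 2.0, 0.5  # hypothetical true parameters
n, reps = 5_000, 200

beta_hats = []
for _ in range(reps):
    # X and W are correlated by construction, so Cov(X, W) != 0
    X = rng.normal(size=n)
    W = 0.8 * X + rng.normal(size=n)
    V = rng.normal(size=n)
    Y = alpha + beta * X + gamma * W + V  # true model

    # OLS slope from the "short" regression of Y on X only (W omitted)
    Xc, Yc = X - X.mean(), Y - Y.mean()
    beta_hats.append((Xc @ Yc) / (Xc @ Xc))

bias = np.mean(beta_hats) - beta
# Theoretical asymptotic bias: gamma * Cov(X, W) / Var(X) = 0.5 * 0.8 = 0.4
print(bias)
```

With these settings the simulated bias should land close to 0.4, matching $\gamma \operatorname{Cov}(X,W)/\operatorname{Var}(X)$.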

Thank you; corrections to my notation and reasoning are also appreciated.