Why is this condition necessary for the omitted variable bias to vanish?


For an OLS model $y_i = x_i^{T} \beta + e_i$, partition the regressors as $x_i = (x_{1i}, x_{2i})$, so the model can be rewritten as $y_i = x_{1i}^{T} \beta_1 + x_{2i}^{T} \beta_2 + e_i$. Suppose one regresses $y_i$ on $x_{1i}$ only and obtains the coefficient $\beta_1'$. A straightforward calculation shows $\beta_1' = \beta_1 + (E(x_{1i} x_{1i}^{T}))^{-1} E(x_{1i} x_{2i}^{T})\beta_2$. In Hansen's *Econometrics*, the author asserts that $(E(x_{1i} x_{1i}^{T}))^{-1} E(x_{1i} x_{2i}^{T})\beta_2 = 0$ iff $E(x_{1i} x_{2i}^{T}) = 0$ or $\beta_2 = 0$, but I cannot see why. Can anyone help?
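To make the decomposition concrete, here is a minimal numerical sketch in numpy (the data-generating process, coefficient values, and variable names are my own illustrative choices, not from Hansen). It checks the exact sample analogue of the formula: the short-regression coefficient equals the long-regression coefficient plus $(X_1^{T}X_1)^{-1} X_1^{T}X_2 \,\hat\beta_2$, because the long-regression residuals are orthogonal to $X_1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Correlated regressors: x2 depends on x1, so E(x1 * x2) != 0
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# Long regression: y on (x1, x2)
X = np.column_stack([x1, x2])
b_long = np.linalg.lstsq(X, y, rcond=None)[0]

# Short regression: y on x1 only
b_short = (x1 @ y) / (x1 @ x1)

# Sample analogue of the bias term: (x1'x1)^{-1} (x1'x2) * beta2_hat
bias = (x1 @ x2) / (x1 @ x1) * b_long[1]

# The identity holds exactly in-sample, up to floating-point error
print(b_short, b_long[0] + bias)
```

The bias term vanishes in this sample exactly when the sample cross-moment `x1 @ x2` is zero or the long-regression coefficient on `x2` is zero, mirroring the population condition in question.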