Ordinary Least Squares: Why do we need mean independent errors?


This is from my lecture on classic linear regression model:

$$ \text { Assumption 1: } E\left(\varepsilon \mid x_{1}, \ldots, x_{K-1}\right)=0 $$

[Lecture slide image omitted: it continues from Assumption 1 to the longer equation discussed below.]

Q: I can follow this fine until "Assumption 1 applies ... so that upon substitution". Where does that longer equation come from? Where is he substituting the previous result about the error vector to get that last lengthy equation?


Accepted answer:

You should have a model assumption that $$y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_{K-1} x_{i, K-1} + \varepsilon_i.$$ Substituting in the expression for $\varepsilon_i$ yields the long equation you are asking about.
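To spell out the step (a sketch of what the slide presumably does; the exact slide is not shown): take the conditional expectation of both sides of the model and apply Assumption 1, which makes the error term drop out:

$$
\begin{aligned}
E\left(y_i \mid x_{i1}, \ldots, x_{i,K-1}\right)
&= E\left(\beta_0 + \beta_1 x_{i1} + \cdots + \beta_{K-1} x_{i,K-1} + \varepsilon_i \mid x_{i1}, \ldots, x_{i,K-1}\right) \\
&= \beta_0 + \beta_1 x_{i1} + \cdots + \beta_{K-1} x_{i,K-1} + E\left(\varepsilon_i \mid x_{i1}, \ldots, x_{i,K-1}\right) \\
&= \beta_0 + \beta_1 x_{i1} + \cdots + \beta_{K-1} x_{i,K-1},
\end{aligned}
$$

where the second line uses linearity of conditional expectation (the regressors are known given the conditioning set, so they pass through), and the last line uses Assumption 1, $E(\varepsilon_i \mid x_{i1}, \ldots, x_{i,K-1}) = 0$. This is why mean independence matters: it guarantees that the regression function $\beta_0 + \beta_1 x_{i1} + \cdots + \beta_{K-1} x_{i,K-1}$ really is the conditional mean of $y_i$.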