Independence between error and regressor


Consider the following classical linear regression:

$$y_i = x_i \theta + u_i, \quad u_i \mid x_i \sim N(0, \sigma^2)$$

Can I conclude that $x$ and $u$ are independent?

I would like this because I want to prove that $y_i \mid x_i \sim N(x_i \theta, \sigma^2)$, and I need independence between $x_i$ and $u_i$ to use the additivity of the variance:

$$V(y_i|x_i) = V(x_i \theta|x_i) + V(u_i|x_i) $$
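The decomposition above can be checked numerically: conditional on $x_i$, the term $x_i\theta$ is a constant, so only $u_i$ contributes to the variance. A minimal simulation sketch (the values of $\theta$, $\sigma$, and $x_i$ below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 2.0, 0.5
x_i = 3.0  # condition on one fixed regressor value

# Draw many realizations of u_i | x_i ~ N(0, sigma^2)
u = rng.normal(0.0, sigma, size=200_000)
y = x_i * theta + u  # y_i = x_i * theta + u_i

# Conditional on x_i, x_i*theta is constant, so it adds nothing to
# the variance: V(y_i | x_i) = V(u_i | x_i) = sigma^2.
print(y.mean())  # close to x_i * theta = 6.0
print(y.var())   # close to sigma^2 = 0.25
```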

Any ideas?

Best answer:

No, you cannot conclude that $x_i$ and $u_i$ are independent just by looking at the equation.

However, in the classical regression setting it is assumed that the $x_i$ are known values (i.e., constants, not random variables), and yes, they are then assumed to be independent of $u_i$.

P.S.: Keep in mind that this independence is an assumption. In a time-series context, or in the presence of lagged variables, the $x_i$ may not be independent of the $u_i$.
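To illustrate that last caveat, here is a hedged sketch of the standard textbook case: a lagged dependent variable as the regressor combined with AR(1) errors. The parameter values ($\theta = 0.5$, $\rho = 0.7$) are arbitrary choices for the simulation, not anything from the question:

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta, rho = 50_000, 0.5, 0.7

# Serially correlated errors: u_t = rho * u_{t-1} + e_t
e = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]

# Lagged dependent variable as regressor: y_t = theta * y_{t-1} + u_t
y = np.zeros(n)
for t in range(1, n):
    y[t] = theta * y[t - 1] + u[t]

x = y[:-1]      # regressor x_t = y_{t-1}
resid = u[1:]   # contemporaneous error u_t

# y_{t-1} contains u_{t-1}, and u_t depends on u_{t-1}, so the
# regressor and the error are correlated, not independent.
print(np.corrcoef(x, resid)[0, 1])  # clearly nonzero
```

With independent (white-noise) errors the same construction would give a correlation near zero; it is the serial correlation in $u_t$ that breaks independence between the lagged regressor and the error.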