Proving an implication in a linear regression


Suppose we have a linear regression: $$ y_i=X_i'\beta+u_i,\quad i=1,\ldots,T. $$ Here $y_i$ and $u_i$ are scalars, and $X_i$ and $\beta$ are $k\times 1$ vectors; $\beta$ is a non-stochastic vector of parameters. I'm given that $$ E[u_i|X_i]=0,\quad\forall i;\\ E[u_iu_j|X_i,X_j]=\sigma^2I[i=j], $$ with $I$ being the indicator function. How do I show that $$ \text{Cov}(y_i,y_j|X_i,X_j)=\sigma^2I[i=j]? $$

Attempt: $$ \text{Cov}(y_i,y_j|X_i,X_j)=\text{Cov}(X_i'\beta+u_i,X_j'\beta+u_j|X_i,X_j)\\ =E(X_i'\beta X_j'\beta+X_i'\beta u_j+X_j'\beta u_i+u_iu_j|X_i,X_j)-E(X_i'\beta+u_i|X_i,X_j)\,E(X_j'\beta+u_j|X_i,X_j), $$ which simplifies to $$ E(u_iu_j|X_i,X_j)-E(u_i|X_i,X_j)E(u_j|X_i,X_j), $$ so I will be done if I can show $$ E(u_i|X_i,X_j)\,E(u_j|X_i,X_j)=0. $$ But I can't proceed further.
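Not a proof, but a quick sanity check of the target identity: a Monte Carlo sketch (the particular $\beta$, $X_1$, $X_2$, and $\sigma^2$ below are my own illustrative choices) that holds the regressors fixed — i.e., conditions on them — draws i.i.d. errors, and checks that the empirical covariance matrix of $(y_1,y_2)$ is close to $\sigma^2 I$:

```python
# Monte Carlo sanity check: with X_1, X_2 held fixed (conditioning on them)
# and u_1, u_2 drawn i.i.d. with variance sigma^2, the conditional
# covariance matrix of (y_1, y_2) should be approximately sigma^2 * I.
import numpy as np

rng = np.random.default_rng(0)
sigma2, n_draws = 2.0, 200_000

beta = np.array([1.0, -0.5, 0.3])      # non-stochastic parameter vector
X1 = np.array([0.4, 1.2, -0.7])        # fixed regressors: conditioning on X_1, X_2
X2 = np.array([-1.1, 0.2, 0.9])

u = rng.normal(0.0, np.sqrt(sigma2), size=(n_draws, 2))  # i.i.d. errors
y1 = X1 @ beta + u[:, 0]
y2 = X2 @ beta + u[:, 1]

C = np.cov(y1, y2)  # empirical 2x2 conditional covariance matrix
print(C)            # approximately [[2, 0], [0, 2]]
```

The off-diagonal entries shrink toward zero as `n_draws` grows, while the diagonal entries converge to `sigma2`, matching $\sigma^2 I[i=j]$.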


You are told that $E[u_i|X_i]=0$ for all $i$.

Since $E[u_i|X_i]$ equals zero almost surely, any further expectation of it is also zero; in particular, the law of iterated expectations gives

$E\big[E[u_i|X_i,X_j]\,\big|\,X_i\big]=E[u_i|X_i]=0.$

Be careful, though: $E\big[E[u_i|X_i]\,\big|\,X_j\big]$ is not in general the same object as $E[u_i|X_i,X_j]$ — the tower property lets you collapse onto the *smaller* conditioning set, not enlarge one. To conclude $E[u_i|X_i,X_j]=0$ you need slightly more than the stated assumptions, e.g. that the observations $(u_i,X_i)$ are independent across $i$, so that conditioning additionally on $X_j$ for $j\neq i$ adds no information about $u_i$ and hence $E[u_i|X_i,X_j]=E[u_i|X_i]=0$.
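Granting $E[u_i|X_i,X_j]=0$ for all $i,j$ (a step beyond the literal assumptions, satisfied for instance under independence across observations), the reduction in the attempt finishes the proof in one line:

$$ \text{Cov}(y_i,y_j|X_i,X_j)=E(u_iu_j|X_i,X_j)-E(u_i|X_i,X_j)\,E(u_j|X_i,X_j)=\sigma^2I[i=j]-0\cdot 0=\sigma^2I[i=j]. $$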