For the model $Y=X\beta$ we have estimators $\beta_0$ and $\beta_1$ that are linear functions of the $y_i$, such that:
$\beta_0=\sum c_i y_i$ and $\beta_1=\sum d_i y_i$
$c_i=\frac{1}{n}- \frac{\bar x\,(x_i-\bar x)}{S_{xx}}$ and $d_i=\frac{x_i-\bar x}{S_{xx}}$
Now I want to find $\mathrm{cov}(\sum c_i y_i,\sum d_i y_i)$
This should come out to $\sum_i c_id_i\, \mathrm{var}(Y_i)$
I tried doing the whole sum at once, without first working out the single-$i$ case, but without success.
I have $$\mathrm{cov}(\sum c_i y_i,\sum d_i y_i)=E(\sum c_i y_i \times \sum d_i y_i)-E(\sum c_i y_i)E(\sum d_i y_i)$$ which I don't know how to calculate
But for a single observation I have:
$$\mathrm{cov}(c_i y_i, d_i y_i)=E(c_id_i y_i^2)-c_id_iE(y_i)^2=c_id_i \mathrm{var}(Y_i),$$ and then I can sum over $i$
Can somebody tell me if I can do the whole sum in one go?
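As a numerical sanity check of the claimed identity, the sketch below (with arbitrary, hypothetical $x$-values, true coefficients, and a homoscedastic $\mathrm{var}(Y_i)=\sigma^2$, none of which come from the question) compares $\sigma^2\sum_i c_i d_i$ against the closed form $-\sigma^2\bar x/S_{xx}$ and against a Monte Carlo estimate of $\mathrm{cov}(\sum c_i y_i, \sum d_i y_i)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design: any fixed x-values will do for the check.
n = 20
x = np.linspace(0.0, 5.0, n)
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

# Weights exactly as defined in the question.
c = 1.0 / n - (x - xbar) * xbar / Sxx   # beta_0 = sum c_i y_i
d = (x - xbar) / Sxx                    # beta_1 = sum d_i y_i

# With var(Y_i) = sigma^2 and independent Y_i, the claim is
# cov(beta_0, beta_1) = sigma^2 * sum_i c_i d_i = -sigma^2 * xbar / Sxx.
sigma = 1.5
claimed = sigma**2 * np.sum(c * d)
closed_form = -sigma**2 * xbar / Sxx
print(claimed, closed_form)   # the two expressions agree

# Monte Carlo: simulate many data sets and estimate the covariance directly.
reps = 200_000
eps = rng.normal(0.0, sigma, size=(reps, n))
y = 2.0 + 3.0 * x + eps       # arbitrary true beta_0 = 2, beta_1 = 3
b0 = y @ c                    # one beta_0 estimate per simulated data set
b1 = y @ d
mc = np.cov(b0, b1)[0, 1]
print(mc)                     # close to the claimed value
```

The simulated covariance matches the formula to within Monte Carlo error, so the per-observation argument summed over $i$ does give the right answer here.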
Hints: I hope these are all you need.
The fundamental rule you need here is that $Cov$ is bilinear, i.e. linear in each of its arguments. In particular, $Cov(aX, bY) = ab\,Cov(X,Y).$ Then use $Cov(X,X) = Var(X).$
To evaluate the product of sums, you also need the independence of $Y_i$ and $Y_j$ for $i \ne j,$ hence $Cov(Y_i, Y_j) = 0,$ and all of the 'cross-products' disappear.
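Written out, the double sum the hints point to (a sketch, using bilinearity and then the independence of the $Y_i$) is:

$$\mathrm{cov}\Big(\sum_i c_i Y_i,\ \sum_j d_j Y_j\Big)=\sum_i\sum_j c_i d_j\,\mathrm{cov}(Y_i,Y_j)=\sum_i c_i d_i\,\mathrm{var}(Y_i),$$

since every cross term with $i \ne j$ has $\mathrm{cov}(Y_i,Y_j)=0$ and each diagonal term has $\mathrm{cov}(Y_i,Y_i)=\mathrm{var}(Y_i).$ So yes, the whole sum can be done in one go.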