I'm having a crisis of faith here; I'm trying to prove that $\hat \beta_0$ is unbiased.
The formula for $\beta_0$ (the parameter) is: $$\beta_0=\mu_Y-\beta_1\mu_X$$ The formula for $\hat \beta_0$ (the estimator) is: $$\hat \beta_0=\bar Y-\hat \beta_1\bar X$$
Thus: $$E(\hat \beta_0)=E(\bar Y)-E(\hat \beta_1\bar X)$$ $$=\mu_Y-E(\hat \beta_1\bar X)$$ $$=\beta_0+\beta_1\mu_X-E(\hat \beta_1 \bar X)$$
Now, it's easy to see that if: $$cov(\hat \beta_1, \bar X)=0$$ then: $$E(\hat \beta_1\bar X)=E(\hat \beta_1)E(\bar X)$$ $$=E(\hat \beta_1)\mu_X$$ Thus:
$$E(\hat \beta_0)=\beta_0+\beta_1\mu_X-E(\hat \beta_1) \mu_X$$
and given that $E(\hat \beta_1)=\beta_1$ (which I know how to prove):
$$E(\hat \beta_0)=\beta_0$$
Yet how can $cov(\hat \beta_1, \bar X)$ equal zero? $\hat \beta_1$ changes when $\bar X$ changes, so surely these terms are related by definition?
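As a sanity check on the claim being questioned, here is a quick Monte Carlo sketch (my own, not part of the original derivation; all parameter values are arbitrary choices): simulate many datasets from $Y = \beta_0 + \beta_1 X + \varepsilon$ with a random design $X$, and empirically estimate both $cov(\hat\beta_1, \bar X)$ and $E(\hat\beta_0)$.

```python
# Monte Carlo sketch: across many simulated datasets, the sample covariance
# between beta1_hat and Xbar is near zero, and the average of beta0_hat is
# near the true beta0. Parameter values (b0=2, b1=3, n=50) are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
b0, b1, n, reps = 2.0, 3.0, 50, 20000

b1_hats, xbars, b0_hats = [], [], []
for _ in range(reps):
    x = rng.normal(5.0, 2.0, n)            # X is random across datasets
    y = b0 + b1 * x + rng.normal(0.0, 1.0, n)
    xbar, ybar = x.mean(), y.mean()
    b1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    b1_hats.append(b1_hat)
    xbars.append(xbar)
    b0_hats.append(ybar - b1_hat * xbar)   # beta0_hat = Ybar - beta1_hat * Xbar

cov_b1_xbar = np.cov(b1_hats, xbars)[0, 1]
mean_b0_hat = np.mean(b0_hats)
print(cov_b1_xbar, mean_b0_hat)
```

Even though $\hat\beta_1$ is computed from the same data as $\bar X$, the estimated covariance comes out essentially zero, and the average of $\hat\beta_0$ sits on top of $\beta_0 = 2$.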
I think that when you work with the estimator, say $\hat{\beta}_1$, you condition on the observed data, so $\bar{x}$ is no longer a random variable but a fixed given quantity, s.t. $$\mathbb{E}[\hat{\beta}_1 \bar{x}] = \bar{x}\,\mathbb{E}[\hat{\beta}_1] = \bar{x}\,\beta_1.$$
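This conditioning argument can be sketched numerically (my own illustration; parameter values are arbitrary): hold one design $x$ fixed and redraw only the errors, so $\bar x$ is a constant and all the randomness in $\hat\beta_1$ comes from the errors.

```python
# Fixed-design sketch: with x (and hence xbar) held fixed across replications,
# the average of beta1_hat over error draws is the true slope b1, so
# E[beta1_hat * xbar] = xbar * E[beta1_hat] = xbar * b1 holds trivially.
import numpy as np

rng = np.random.default_rng(1)
b0, b1, n, reps = 2.0, 3.0, 50, 20000
x = rng.normal(5.0, 2.0, n)               # one fixed design, reused below
xbar = x.mean()
sxx = np.sum((x - xbar) ** 2)

b1_hats = np.empty(reps)
for i in range(reps):
    y = b0 + b1 * x + rng.normal(0.0, 1.0, n)   # only the errors are redrawn
    b1_hats[i] = np.sum((x - xbar) * (y - y.mean())) / sxx

mean_b1_hat = b1_hats.mean()
print(mean_b1_hat)
```

Since $\bar x$ never varies here, $\hat\beta_1 \bar x$ averages to $\bar x \beta_1$ exactly when $\hat\beta_1$ averages to $\beta_1$, which is what the run shows.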