The least squares estimator of $\beta_0$, namely $\hat{\beta}_0 = \bar{Y} - \hat{\beta}_1\bar{X}$, can be expressed as a linear function of the $Y_i$, say $\hat{\beta}_0 = \sum_{i=1}^n a_iY_i$. Let $\beta'_0$ be another unbiased estimator of $\beta_0$, say $$\beta'_0 = \sum_{i=1}^n c_iY_i,$$ where $c_i = a_i + b_i$. Show that $\operatorname{Var}(\hat{\beta}_0) \leq \operatorname{Var}(\beta'_0)$.
Intuitively it makes sense, but how should I start proving this?
Calculate the variance of $\beta'_0$ and compare it with the variance of the OLS estimator. From the two expressions you will be able to establish the inequality.
From the definition of variance, you have
$$\operatorname{Var}(\beta'_0)= \mathbb{E}\left[\left(\sum_{i=1}^{n}c_iY_i - \mathbb{E}\left[\sum_{i=1}^n c_iY_i\right]\right)^2\right],$$ where $\mathbb{E}$ is the expectation operator (its empirical analogue is an average). You can then substitute the linear model for $Y_i$ (i.e. $Y_i = \beta_0 + \beta_1 X_i + \epsilon_i$) and compare the result with the variance of the OLS estimator.
This link might help you: http://sfb649.wiwi.hu-berlin.de/fedc_homepage/xplore/tutorials/xegbohtmlnode14.html
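The comparison can also be checked numerically. The sketch below (my own illustration, not part of the exercise) uses simulated $X_i$ values: since the $Y_i$ are independent with common variance $\sigma^2$, the variance of any linear estimator $\sum_i c_iY_i$ is $\sigma^2\sum_i c_i^2$, so it suffices to compare the sums of squared weights. The perturbation `d` plays the role of $b_i$ and is projected to be orthogonal to both the constant and $x$, which preserves unbiasedness.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
x = rng.normal(size=n)
xbar = x.mean()
sxx = ((x - xbar) ** 2).sum()

# OLS weights: beta0_hat = sum_i a_i Y_i, with
# a_i = 1/n - xbar * (x_i - xbar) / Sxx
a = 1.0 / n - xbar * (x - xbar) / sxx

# Perturbation d orthogonal to both the constant and x, so that
# c_i = a_i + d_i still gives an unbiased estimator of beta0
# (sum c_i = 1 and sum c_i x_i = 0 are preserved).
X = np.column_stack([np.ones(n), x])
d_raw = rng.normal(size=n)
d = d_raw - X @ np.linalg.lstsq(X, d_raw, rcond=None)[0]
c = a + d

# Variances divided by sigma^2: Var(sum c_i Y_i) = sigma^2 * sum c_i^2.
var_ols = (a ** 2).sum()
var_alt = (c ** 2).sum()

# The cross term sum(a_i d_i) vanishes because a lies in the span of
# the constant and x, so the variances differ by exactly sum(d_i^2) >= 0.
print(var_ols <= var_alt)
print(np.isclose(var_alt, var_ols + (d ** 2).sum()))
```

This mirrors the algebraic proof: expanding $\sum_i(a_i+b_i)^2$ gives $\sum_i a_i^2 + 2\sum_i a_ib_i + \sum_i b_i^2$, and the unbiasedness constraints force the cross term to zero, leaving an extra non-negative term $\sum_i b_i^2$.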