A simple linear regression is defined as follows: $$y_i=\alpha+\beta x_i+\epsilon_i \qquad i=1,\dots,n$$
An inefficient way of estimating $\beta$ is defined as follows: $$\tilde{\beta}=\frac{\overline{y}''-\overline{y}'}{\overline{x}''-\overline{x}'}$$ where $\overline{y}'$ is the average of the first three observations and $\overline{y}''$ is the average of the last three observations. The same logic applies to $\overline{x}'$ and $\overline{x}''$.
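To make the estimator concrete, here is a minimal sketch of computing $\tilde\beta$ on synthetic data (the regressors, $\alpha=2$, $\beta=0.5$, and the noise level are all my own illustrative choices, not part of the question):

```python
import numpy as np

# Hypothetical synthetic data: fixed regressors, true alpha=2, beta=0.5.
rng = np.random.default_rng(0)
n = 20
x = np.arange(1.0, n + 1)                  # x_1, ..., x_n (non-random)
y = 2.0 + 0.5 * x + rng.normal(0.0, 1.0, n)

# The crude estimator: difference of the last-three and first-three means.
beta_tilde = (y[-3:].mean() - y[:3].mean()) / (x[-3:].mean() - x[:3].mean())
print(beta_tilde)
```

With a wide spread in the $x_i$ the denominator is large, so a single draw already lands reasonably near the true slope.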
I think I can write them this way (though I am not sure whether it actually simplifies the problem): $$\overline{y}'=\frac{1}{3}\sum_{i=1}^3 y_i \qquad \overline{y}''=\frac{1}{3}\sum_{i=n-2}^n y_i$$ $$\overline{x}'=\frac{1}{3}\sum_{i=1}^3 x_i \qquad \overline{x}''=\frac{1}{3}\sum_{i=n-2}^n x_i$$
How can I prove that $\tilde\beta$ is an unbiased estimator of $\beta$? In mathematical terms, how can I show that $$E[\tilde{\beta}]=\beta\,?$$
I finally found the right solution. Since the $x_i$'s are known (non-random), I can pull the denominator out of the expectation: $$E[\tilde\beta]=\frac{1}{\overline{x}''-\overline{x}'}E[\overline{y}''-\overline{y}']$$ Then, using $E[y_i]=\alpha+\beta x_i$ (because $E[\epsilon_i]=0$): $$E[\overline{y}'']=\frac{1}{3}\sum_{i=n-2}^n E[y_i]=\frac{1}{3}\sum_{i=n-2}^n(\alpha+\beta x_i)=\frac{1}{3}\,3\alpha+\beta\,\frac{1}{3}\sum_{i=n-2}^n x_i=\alpha+\beta\overline{x}''$$ $$E[\overline{y}']=\frac{1}{3}\sum_{i=1}^3 E[y_i]=\alpha+\beta\overline{x}'$$ So: $$E[\tilde\beta]=\frac{1}{\overline{x}''-\overline{x}'}\underset{=\beta\overline{x}''-\beta\overline{x}'}{\underbrace{\left(\alpha+\beta\overline{x}''-\alpha-\beta\overline{x}'\right)}}=\frac{\beta(\overline{x}''-\overline{x}')}{\overline{x}''-\overline{x}'}=\beta$$
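The derivation can also be checked numerically: with the $x_i$ held fixed, averaging $\tilde\beta$ over many simulated error draws should recover $\beta$. A minimal Monte Carlo sketch (my own parameter choices: $\alpha=2$, $\beta=0.5$, $n=20$, standard normal errors):

```python
import numpy as np

# Monte Carlo check of unbiasedness: keep the x_i fixed, redraw the
# errors many times, and compare the average of beta_tilde to beta.
rng = np.random.default_rng(42)
alpha, beta, n, reps = 2.0, 0.5, 20, 20_000
x = np.arange(1.0, n + 1)
denom = x[-3:].mean() - x[:3].mean()       # xbar'' - xbar', known constant

estimates = np.empty(reps)
for r in range(reps):
    y = alpha + beta * x + rng.normal(0.0, 1.0, n)
    estimates[r] = (y[-3:].mean() - y[:3].mean()) / denom

print(estimates.mean())   # close to beta = 0.5
```

The sample mean of the estimates settles near $\beta$, consistent with $E[\tilde\beta]=\beta$, while the spread of `estimates` illustrates why the estimator is inefficient compared to ordinary least squares: it discards all but six observations.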