I need to determine whether the estimator $\tilde{\beta}_{2} = \frac{y_{n}-y_{1}}{x_{n}-x_{1}}$ is unbiased under each of the assumptions i) $E(u_{i}\mid x)=0$ and ii) $E(u_{i}\mid x_{i})=0$. I also need to calculate its variance assuming $E(u_{i}\mid x)=0$, $Cov(u_{i},u_{j})=0$ for $i\neq j$, and $E(u_{i}^{2}\mid x)=\sigma^{2}$.
The starting equation for this is $y_{i}=\beta _{1}+\beta _{2}x_{i}+u_{i}$.
This is my first question and I've tried to do it in the right format, but apologies if it's not!
I am dropping the conditioning on $x$ in what follows to simplify notation; conditioning on $x$ is what allows you to pull the $x$s out of the expectations and variances as if they were constants. The $\beta$s are constants. The $u$s are the random variables.
This uses the linearity of the expectation operator and the fact that the $u$s are zero in expectation. You know that $$\begin{aligned} \mathbb E(\tilde \beta) & =\mathbb E \left( \frac{y_n-y_1}{x_n-x_1}\right)\\ &=\mathbb E \left( \frac{\beta_1+\beta_2x_n+u_n-\beta_1-\beta_2x_1-u_1}{x_n-x_1}\right)\\ &=\mathbb E \left( \frac{\beta_2(x_n-x_1)+(u_n-u_1)}{x_n-x_1}\right)\\ &=\beta_2+\frac{\mathbb E(u_n)-\mathbb E(u_1)}{x_n-x_1}\\ &=\beta_2 + \frac{0-0}{x_n-x_1}\\ &=\beta_2, \end{aligned}$$ so the estimator is unbiased.
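If you want to convince yourself numerically, a quick Monte Carlo check works well. This is only a sketch: the values of $\beta_1$, $\beta_2$, $\sigma$, $n$, and the $x$ grid are made-up illustrative choices, not anything given in the question.

```python
import numpy as np

# Illustrative parameter choices (assumptions, not from the question)
rng = np.random.default_rng(0)
beta1, beta2, sigma, n = 1.0, 2.0, 1.0, 10
x = np.linspace(0.0, 9.0, n)  # x held fixed across replications, i.e. conditioned on

reps = 200_000
u = rng.normal(0.0, sigma, size=(reps, n))   # zero-mean errors
y = beta1 + beta2 * x + u                    # y_i = beta_1 + beta_2 x_i + u_i
beta_tilde = (y[:, -1] - y[:, 0]) / (x[-1] - x[0])

print(beta_tilde.mean())  # should be close to beta2 = 2.0
```

Averaging over many replications, the sample mean of $\tilde\beta_2$ sits right on top of $\beta_2$, which is exactly what unbiasedness says.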
You can do something similar with the variance and covariance formulas, using the fact that the variance of a constant is zero, as is its covariance with a random variable, and that the $u$s are uncorrelated:
$$\begin{aligned} \mathbb {Var}(\tilde \beta) &= \mathbb {Var}\left( \beta_2+\frac{u_n - u_1}{x_n-x_1} \right) \\ &=\mathbb {Var}(\beta_2)+\mathbb {Var} \left( \frac{u_n - u_1}{x_n-x_1} \right)+ 2 \cdot \mathbb {Cov} \left(\beta_2,\frac{u_n - u_1}{x_n-x_1}\right) \\ &=0 + \frac{\mathbb {Var}(u_n)+\mathbb {Var} (u_1) - 2 \cdot \mathbb {Cov}(u_n,u_1)}{(x_n-x_1)^2}+0 \\ &=\frac{2\sigma^2}{(x_n-x_1)^2} \end{aligned}$$
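The variance formula can be checked the same way by simulation. Again the parameter values here ($\beta_1$, $\beta_2$, $\sigma$, the $x$ grid) are illustrative assumptions, not part of the question.

```python
import numpy as np

# Illustrative parameter choices (assumptions, not from the question)
rng = np.random.default_rng(1)
beta1, beta2, sigma, n = 1.0, 2.0, 1.0, 10
x = np.linspace(0.0, 9.0, n)

reps = 200_000
u = rng.normal(0.0, sigma, size=(reps, n))   # independent draws: Cov(u_i, u_j) = 0
y = beta1 + beta2 * x + u
beta_tilde = (y[:, -1] - y[:, 0]) / (x[-1] - x[0])

theoretical = 2 * sigma**2 / (x[-1] - x[0])**2   # 2*sigma^2 / (x_n - x_1)^2
print(beta_tilde.var(), theoretical)             # the two should nearly agree
```

The simulated variance of $\tilde\beta_2$ matches $2\sigma^2/(x_n-x_1)^2$ closely, in line with the derivation above.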