I have the following model:
$Y_1=\beta+\varepsilon_1+\varepsilon_2$
$Y_2=\beta+\varepsilon_3+\varepsilon_4$
$Y_3=\beta+\varepsilon_1+\varepsilon_4+\varepsilon_5$
$Y_4=\beta+\varepsilon_2+\varepsilon_3+\varepsilon_5$
where the $\varepsilon_i$ are i.i.d. $\mathcal{N}(0,\sigma^2)$.
I would like to obtain the best (unbiased and with minimum variance) estimator of $\beta$. That is, I would like to know $\hat{\beta}=f(Y_1,Y_2,Y_3,Y_4)$. How should I obtain it?
I would really appreciate your help.
A general result, which should be in your lecture notes, says that $\hat\beta$ is an unbiased affine transform of the vector $(Y_k)_k$. Unbiasedness for every $\beta$ forces the constant term to vanish, so $\hat\beta$ is linear, and, since the coefficient of $\beta$ in each $Y_k$ is $1$, it imposes $\hat\beta=\sum\limits_{k=1}^4x_kY_k$ for some $(x_k)_k$ such that $\sum\limits_{k=1}^4x_k=1$.
At this point, the method of Lagrange multipliers readily yields $(x_k)_k$ but, in the present case, symmetry considerations offer a nice alternative proof.
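In this Gaussian setting the constrained minimization has the closed form $\Sigma^{-1}\mathbf 1/(\mathbf 1^\top\Sigma^{-1}\mathbf 1)$, the generalized-least-squares weights, where $\Sigma$ is the covariance matrix of $(Y_k)_k$. A quick numerical sketch (the matrix below is read off from the model; it is not part of the original argument):

```python
import numpy as np

# Covariance of (Y_1, ..., Y_4) in units of sigma^2, read off from the model:
# Var(Y_1) = Var(Y_2) = 2, Var(Y_3) = Var(Y_4) = 3, and each pair of Y's
# sharing exactly one epsilon has covariance 1 (Y_1 and Y_2 share none).
S = np.array([[2., 0., 1., 1.],
              [0., 2., 1., 1.],
              [1., 1., 3., 1.],
              [1., 1., 1., 3.]])

one = np.ones(4)
w = np.linalg.solve(S, one)   # unnormalized optimal weights S^{-1} 1
w = w / (one @ w)             # normalize so the weights sum to 1

print(w)                      # approximately [0.5, 0.5, 0, 0]
```

This already identifies the weights; the symmetry argument derives them by hand.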
To see this, note that the swap $\varepsilon_1\leftrightarrow\varepsilon_3$, $\varepsilon_2\leftrightarrow\varepsilon_4$ exchanges $Y_1$ with $Y_2$ and exchanges $Y_3$ with $Y_4$. The distribution of $(Y_k)_k$ is invariant under this operation and, since the variance is a strictly convex function of $(x_k)_k$ on the constraint set, the minimizer is unique, hence invariant under the swap as well; this yields $x_1=x_2$ and $x_3=x_4$. Hence $x_1=x_2=\frac12(1-x)$ and $x_3=x_4=\frac12x$ for some $x$.
Collecting the coefficient of each $\varepsilon_i$ in $\hat\beta-\beta$, the variance of $\hat\beta$ is $\sigma^2$ times $(x_1+x_3)^2+(x_1+x_4)^2+(x_2+x_4)^2+(x_2+x_3)^2+(x_3+x_4)^2$ and, when $(x_k)_k$ is as above, this sum is $\frac14+\frac14+\frac14+\frac14+x^2=1+x^2$, which is minimal for $x=0$.
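The algebra in the last step can be verified symbolically; a small sketch with sympy, substituting the symmetric weights into the variance expression:

```python
import sympy as sp

x = sp.symbols('x')
x1 = x2 = (1 - x) / 2   # symmetric weights forced by the swap argument
x3 = x4 = x / 2

# Variance of beta-hat divided by sigma^2: one squared coefficient
# per epsilon_1, ..., epsilon_5.
var = ((x1 + x3)**2 + (x1 + x4)**2 + (x2 + x4)**2
       + (x2 + x3)**2 + (x3 + x4)**2)

print(sp.expand(var))   # x**2 + 1, minimal at x = 0
```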
Finally, $\hat\beta=\frac12(Y_1+Y_2)$, with variance $\sigma^2$.
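As a sanity check, a short Monte Carlo simulation (the values of $\beta$, $\sigma$ and the sample size are arbitrary choices for illustration) confirms that $\frac12(Y_1+Y_2)$ is unbiased with variance $\sigma^2$, beating, for example, the plain average of all four observations, whose variance is $\frac54\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
beta, sigma, n = 3.0, 1.0, 200_000   # arbitrary illustration values

eps = rng.normal(0.0, sigma, size=(n, 5))   # columns are eps_1, ..., eps_5
Y1 = beta + eps[:, 0] + eps[:, 1]
Y2 = beta + eps[:, 2] + eps[:, 3]
Y3 = beta + eps[:, 0] + eps[:, 3] + eps[:, 4]
Y4 = beta + eps[:, 1] + eps[:, 2] + eps[:, 4]

best = 0.5 * (Y1 + Y2)                  # the estimator derived above
naive = 0.25 * (Y1 + Y2 + Y3 + Y4)      # plain average, for comparison

print(best.mean(), best.var())    # approximately 3.0 and 1.0
print(naive.mean(), naive.var())  # approximately 3.0 and 1.25
```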