Let $Y_1,Y_2,Y_3$ and $Y_4$ be four random variables such that $E(Y_1)=\theta_1-\theta_3,\quad E(Y_2)=\theta_1+\theta_2-\theta_3,\quad E(Y_3)=\theta_1-\theta_3,\quad E(Y_4)=\theta_1-\theta_2-\theta_3,$ where $\theta_1,\theta_2,\theta_3$ are unknown parameters. Also assume that $Var(Y_i)=\sigma^2$, $i=1,2,3,4.$ Which one of the following is true?
A. $\theta_1,\theta_2,\theta_3$ are estimable.
B. $\theta_1+\theta_3$ is estimable.
C. $\theta_1-\theta_3$ is estimable and $\dfrac{1}{2}(Y_1+Y_3)$ is the best linear unbiased estimate of $\theta_1-\theta_3$.
D. $\theta_2$ is estimable.
The given answer is C, which looks strange to me, because I got D.
Why did I get D? Because $E(Y_2-Y_4)=2\theta_2$, so $\dfrac{1}{2}(Y_2-Y_4)$ is a linear unbiased estimator of $\theta_2$, which makes $\theta_2$ estimable.
Why don't I see how C could be the answer? I can see that $\dfrac{Y_1+Y_2+Y_3+Y_4}{4}$ is also an unbiased estimator of $\theta_1-\theta_3$ (the $\theta_2$ terms cancel), and its variance is less than that of $\dfrac{Y_1+Y_3}{2}$, so the latter cannot be the *best* linear unbiased estimator.
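For what it's worth, here is a quick Monte Carlo sanity check of both observations (Python/NumPy; the parameter values $\theta_1=2$, $\theta_2=5$, $\theta_3=1$, $\sigma=1$ are arbitrary choices of mine, and I draw the $Y_i$'s independently): $\frac{1}{2}(Y_2-Y_4)$ averages to $\theta_2$, and the four-observation mean is unbiased for $\theta_1-\theta_3$ with half the variance of $\frac{1}{2}(Y_1+Y_3)$.

```python
import numpy as np

rng = np.random.default_rng(0)
th1, th2, th3, sigma = 2.0, 5.0, 1.0, 1.0  # arbitrary values, chosen only for this check
n = 200_000                                # number of Monte Carlo replications

# Independent draws with the stated means and common variance sigma^2
Y1 = th1 - th3       + sigma * rng.standard_normal(n)
Y2 = th1 + th2 - th3 + sigma * rng.standard_normal(n)
Y3 = th1 - th3       + sigma * rng.standard_normal(n)
Y4 = th1 - th2 - th3 + sigma * rng.standard_normal(n)

est_th2  = (Y2 - Y4) / 2            # should average to theta_2 = 5
est_pair = (Y1 + Y3) / 2            # unbiased for theta_1 - theta_3; variance sigma^2/2
est_all  = (Y1 + Y2 + Y3 + Y4) / 4  # theta_2 cancels; variance sigma^2/4

print(est_th2.mean())                  # ~ 5.0
print(est_pair.mean(), est_pair.var()) # ~ 1.0, ~ 0.5
print(est_all.mean(), est_all.var())   # ~ 1.0, ~ 0.25
```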
Please tell me where I am going wrong. Any help is appreciated. Thanks!
Also posted here: https://stats.stackexchange.com/questions/319117/a-problem-on-estimability-of-parameters
I recently reviewed the question, and there is nothing written about the $Y_i$'s being correlated. So please assume that the $Y_i$'s are uncorrelated.