A random sample of 3 observations was obtained from this population. Consider the following estimators of $B$ :
$\hat{B}_1 = (Y_1 + Y_2 + Y_3)/3$ and $\hat{B}_2 = Y_1/6 + Y_2/3 + Y_3/2$. Is $\hat{B}_1$ an unbiased estimator of $B$? What about $\hat{B}_2$?
and the answer to that is: \begin{align*} E(\hat{B}_1)&=E((Y_1/3)+(Y_2/3)+(Y_3/3))\\ &= (1/3)E(Y_1)+(1/3)E(Y_2)+(1/3)E(Y_3)\\ &= (1/3)B + (1/3)B+ (1/3)B = B \end{align*} Thus, $\hat{B}_1$ is an unbiased estimator.
my question then is, what sort of property is being applied here? I'm not following what is being done... If they gave me, say, a coefficient and a sample, I could run OLS on the sample, get the estimated coefficient, and compare the two... but what is going on here?
Suppose that we are sampling from a population, say of people. For each person in our sample, we measure some feature of interest, say blood cholesterol level. We are interested in estimating the mean population blood cholesterol level. Denote that mean population level by $\beta$. The purpose of our experiment is to estimate $\beta$.
Suppose we do the sampling $3$ times (that's a very small sample). Let random variables $Y_1, Y_2,Y_3$ be the results, that is, the blood cholesterol levels of the $3$ people.
Each act of sampling can be used to find information about the value of the population mean $\beta$.
An estimator for $\beta$ is a random variable, based on sampling, designed to estimate some particular parameter, in this case $\beta$.
There are many ways one can imagine "combining" the random variables $Y_1,Y_2,Y_3$ to estimate $\beta$. For example, one could use $\sqrt[3]{Y_1Y_2Y_3}$, or many other things, reasonable or not.
The question asks about two specific estimators, both of the shape $a_1Y_1+a_2Y_2+a_3Y_3$, where the $a_i$ are constants.
The fundamental fact to be used is the linearity of expectation. In our general situation, this says $$E(a_1Y_1+a_2Y_2+a_3Y_3)=a_1E(Y_1)+a_2E(Y_2)+a_3E(Y_3).$$
The estimator $B_1$: We have $a_i=\frac{1}{3}$ for all $i$. Recall that $E(Y_i)=\beta$. So by the linearity of expectation, we have $$E(B_1)=E\left(\frac{1}{3}Y_1+\frac{1}{3}Y_2+\frac{1}{3}Y_3\right)=\frac{1}{3}\beta+\frac{1}{3}\beta+\frac{1}{3}\beta=\beta.$$ Thus if we use $B_1$ to estimate $\beta$, on average we will get the right value. Thus $B_1$ is an unbiased estimator of $\beta$.
The estimator $B_2$: This is very similar to $B_1$, with $a_1=\frac{1}{6}$, $a_2=\frac{1}{3}$, and $a_3=\frac{1}{2}$. By the linearity of expectation, we have $E(B_2)=\frac{1}{6}\beta+\frac{1}{3}\beta+\frac{1}{2}\beta=\beta$.
Thus $B_2$ is also an unbiased estimator of $\beta$.
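A quick simulation makes this concrete. The values below ($\beta = 200$, $\sigma = 30$, and a normal population) are made-up illustrative choices, not part of the problem; any population with mean $\beta$ would do. Averaging each estimator over many repeated samples of size $3$ should land close to $\beta$:

```python
import random

random.seed(0)
beta = 200.0       # hypothetical true population mean (illustrative)
sigma = 30.0       # hypothetical population standard deviation (illustrative)
trials = 200_000

sum_b1 = sum_b2 = 0.0
for _ in range(trials):
    # one random sample of 3 observations from a population with mean beta
    y1, y2, y3 = (random.gauss(beta, sigma) for _ in range(3))
    sum_b1 += (y1 + y2 + y3) / 3        # B_1: equal weights 1/3
    sum_b2 += y1/6 + y2/3 + y3/2        # B_2: weights 1/6, 1/3, 1/2

# both long-run averages should be close to beta = 200
print(sum_b1 / trials)
print(sum_b2 / trials)
```

Both printed averages come out within a small fraction of a unit of $200$, which is what unbiasedness predicts.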
What the two random variables $B_1$ and $B_2$ have in common is that the "weights" $a_1,a_2,a_3$ in each case have sum equal to $1$. In any such case we will have $E(a_1Y_1+a_2Y_2+a_3Y_3)=a_1\beta+a_2\beta+a_3\beta=(a_1+a_2+a_3)\beta=\beta$.
By the way, $B_1$ and $B_2$, though both unbiased, are not equally good as estimators. It turns out that $B_1$ has lower variance than $B_2$, and when we are comparing unbiased estimators, lower variance is good.
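This variance claim can be checked directly, assuming the $Y_i$ are independent with a common variance $\sigma^2$ (not stated explicitly in the problem, but standard for a random sample). Since $\operatorname{Var}(a_1Y_1+a_2Y_2+a_3Y_3)=(a_1^2+a_2^2+a_3^2)\sigma^2$ for independent $Y_i$,
$$\operatorname{Var}(B_1)=\left(\tfrac{1}{9}+\tfrac{1}{9}+\tfrac{1}{9}\right)\sigma^2=\frac{\sigma^2}{3}=\frac{6\sigma^2}{18},
\qquad
\operatorname{Var}(B_2)=\left(\tfrac{1}{36}+\tfrac{1}{9}+\tfrac{1}{4}\right)\sigma^2=\frac{7\sigma^2}{18},$$
so $\operatorname{Var}(B_1)<\operatorname{Var}(B_2)$: the equal-weight estimator is the more precise of the two.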