For what value of an unknown is an estimator biased?


The problem goes like this:

Let $X$ be a random variable, and let $X_{1}, X_{2}, X_{3}$ be a sample of $X$, i.e. random variables that have the same distribution as $X$ and are independent of each other. In order to estimate the mean $m$ of $X$, we use the following estimator: $$Y=0.6X_{1}+ 0.1X_{2}+ aX_{3}$$ with $a\in \mathbb{R}$.

What is the condition so that Y is a biased estimator of the mean $m$ of $X$ ?

I know that for an estimator $\hat{p}$ to be unbiased the condition is that: $$E(\hat{p})=p$$

But for $Y$ to be biased do we just write: $$m \neq 0.6X_{1}+ 0.1X_{2}+ aX_{3}$$

And compute:

$$a \neq \frac{ -0.6X_{1}-0.1X_{2}+m }{ X_{3} }$$

Is this the condition? I'm not really sure about this. I got this problem off the web, and I read that the answer is $0.3$, but I'm not sure whether that's true. If it is, how do we get to that?

Thanks a lot.



**Best answer:**

The definition of unbiasedness should say that that equality holds for all values of $p$ in the parameter space. As $p$ changes, so does $\operatorname E(\widehat p),$ but it should change in such a way that it remains equal to $p.$

You have \begin{align} \operatorname E(Y) & = \operatorname E(0.6X_1+ 0.1X_2+ aX_3) \\[10pt] &= 0.6\operatorname E(X_1) + 0.1\operatorname E(X_2) + a\operatorname E(X_3) \\[10pt] & = 0.6m + 0.1 m + am \\[10pt] & = (0.7+a) m \\[10pt] & = m. \end{align} Solve for $a,$ and you can assume $m\ne0,$ since the definition of unbiasedness says the equality must remain true as $m$ changes.
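To make the last step concrete: setting $(0.7+a)m = m$ and dividing by $m$ (allowed because unbiasedness must hold for every $m$, including $m\ne0$) gives $0.7+a=1$. A quick sketch of that arithmetic in exact rational numbers (using `Fraction` to avoid floating-point noise like `0.30000000000000004`):

```python
from fractions import Fraction

# Unbiasedness requires the coefficients of the estimator to sum to 1:
# 0.6 + 0.1 + a = 1, so a = 1 - (0.6 + 0.1).
a = 1 - (Fraction(6, 10) + Fraction(1, 10))
print(a)  # 3/10
```
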

**Second answer:**

We have $$E(Y) = 0.6E(X_1)+0.1E(X_2)+aE(X_3) = 0.6m+0.1m+am = (0.7+a)m.$$ In order for $Y$ to be an unbiased estimator of $m$, we need $E(Y)=m$ which is only the case if $a=0.3.$ So $Y$ is biased whenever $a\ne0.3.$
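The conclusion can also be checked empirically. The sketch below (my own illustration, not part of the original problem) draws the $X_i$ from a Normal$(m, 1)$ distribution with $m=5$ and compares the sample average of $Y$ for $a=0.3$ (unbiased) versus $a=0.5$, where the average drifts toward $(0.7+0.5)\,m = 6$:

```python
import random

random.seed(0)
m, n = 5.0, 200_000  # true mean and number of Monte Carlo draws

def mean_of_Y(a):
    """Average of Y = 0.6*X1 + 0.1*X2 + a*X3 over n independent samples."""
    total = 0.0
    for _ in range(n):
        x1, x2, x3 = (random.gauss(m, 1.0) for _ in range(3))
        total += 0.6 * x1 + 0.1 * x2 + a * x3
    return total / n

print(mean_of_Y(0.3))  # close to 5.0: the estimator is unbiased
print(mean_of_Y(0.5))  # close to (0.7 + 0.5) * 5 = 6.0: biased
```

The distribution choice is arbitrary; only the mean matters, since the bias calculation uses linearity of expectation alone.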