Calculating the mean squared error for the mean


Exam Question: Let $X_{1}$ and $X_{2}$ be two independent random variables, each normally distributed with mean $\mu$, where Var$(X_{1})=1$ and Var$(X_{2})=2$. An unbiased estimator for the true mean $\mu$ is given by:

$T=\alpha X_{1}+(1-\alpha) X_{2}$.

We still have to choose $\alpha \in [0,1]$. For what value of $\alpha$ is the mean squared error (MSE) of the estimator $T$ for $\mu$ smallest?

Answer given to the Question

Because $T$ is unbiased, it holds that MSE$(T;\mu)=$ Var$(T)$, and because the variables are independent it follows that Var$(T)=\alpha^{2}\,$Var$(X_{1})+(1-\alpha)^{2}\,$Var$(X_{2})=\alpha^{2}+2(1-\alpha)^{2}=3\alpha^{2}-4\alpha+2$. This is minimal for $\alpha=2/3$.
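As a quick sanity check (my own sketch, not part of the quoted answer), one can simulate the two variables and estimate the MSE of $T$ empirically over a grid of $\alpha$ values; the empirical minimizer should land close to $2/3$, with minimal MSE close to $3(2/3)^2-4(2/3)+2=2/3$. The mean $\mu=5$ and sample size below are arbitrary choices for the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 5.0       # arbitrary true mean for the simulation
n = 200_000    # number of simulated draws

# Independent normals with Var(X1) = 1 and Var(X2) = 2
# (standard deviations 1 and sqrt(2)).
x1 = rng.normal(mu, 1.0, n)
x2 = rng.normal(mu, np.sqrt(2.0), n)

# Empirical MSE of T = a*X1 + (1-a)*X2 over a grid of alphas.
alphas = np.linspace(0.0, 1.0, 101)
mse = np.array([np.mean((a * x1 + (1 - a) * x2 - mu) ** 2) for a in alphas])

best = alphas[np.argmin(mse)]
print(best)           # should be close to 2/3
print(mse.min())      # should be close to the minimal variance 2/3
```

The simulated curve traces out the quadratic $3\alpha^2-4\alpha+2$ up to Monte Carlo noise.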

My Question: How did $\alpha$ end up being $2/3$? What should I consider in order to find the $\alpha$ that minimizes the MSE of an estimator?


Answer: $(3\alpha^2-4\alpha+2)'=6\alpha-4=0\iff \alpha=2/3$. Since the leading coefficient $3$ is positive, this critical point is a minimum.
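Spelling out the general recipe (my own gloss on the answer above): the MSE is a quadratic $a\alpha^2+b\alpha+c$ with $a>0$, so it is minimized at the vertex $\alpha=-b/(2a)$, which is exactly where the derivative $2a\alpha+b$ vanishes. A minimal sketch with exact arithmetic:

```python
from fractions import Fraction

# MSE(T) = 3*alpha^2 - 4*alpha + 2, written as a*alpha^2 + b*alpha + c.
a, b, c = 3, -4, 2

# The derivative 2*a*alpha + b vanishes at the vertex alpha = -b/(2a);
# with a > 0 this critical point is the minimum.
alpha_star = Fraction(-b, 2 * a)
print(alpha_star)                              # 2/3
print(a * alpha_star**2 + b * alpha_star + c)  # minimal MSE: 2/3
```

The same vertex formula answers the general question: differentiate the MSE with respect to $\alpha$, set the derivative to zero, and check that the second derivative (here $2a=6$) is positive.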