Minimizing the MSE of the estimator $\hat{\theta}(a,b) = \frac{1}{n} \sum^n_{i=1} Y_ia_i + b$


We have $Y_i = \theta + \epsilon_i$, with $E\epsilon_i = 0$ and $\operatorname{Var}(\epsilon_i) = 0.5$.

We have a proposed estimator for $\theta$:

$\hat{\theta}(a,b) = \frac{1}{n} \sum^n_{i=1} Y_ia_i + b$, where $a = (a_i)^n_{i=1}$ must satisfy the constraint $\frac{1}{n}\sum^n_{i=1} a_i = 1$.

We want to find $a = (a_i)^n_{i=1}$ and $b$ that minimize the MSE of our estimator. Here is some of my work:

MSE = $E[(\hat{\theta} - \theta)^2]$
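The MSE of a candidate $(a, b)$ can be approximated by simulation. Here is a minimal Monte Carlo sketch; the error distribution is not specified in the problem (only its mean and variance), so a normal distribution is assumed, and the function name `mse_estimate` and the value $\theta = 2.0$ are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def mse_estimate(a, b, theta=2.0, var_eps=0.5, trials=200_000):
    """Monte Carlo estimate of E[(theta_hat - theta)^2].

    Assumes eps_i ~ Normal(0, var_eps); the problem only fixes
    the mean and variance of the errors, not the distribution.
    """
    a = np.asarray(a, dtype=float)
    n = len(a)
    eps = rng.normal(0.0, np.sqrt(var_eps), size=(trials, n))
    Y = theta + eps                          # Y_i = theta + eps_i
    theta_hat = (Y * a).mean(axis=1) + b     # (1/n) sum a_i Y_i + b
    return np.mean((theta_hat - theta) ** 2)

# With uniform weights a_i = 1 and b = 0, the estimator is the sample
# mean, so the MSE should be close to Var(eps)/n = 0.5/5 = 0.1 here.
print(mse_estimate(np.ones(5), 0.0))
```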

My guess is we want to take the derivative with respect to $b$ and $a$ and find the optimal values that way. As in, minimize what is inside of the expectation.

I have $$ (\hat{\theta} - \theta)^2 = \Big(\frac{1}{n} \sum Y_ia_i + b - \theta \Big)^2\\ = \Big(\frac{1}{n}\sum Y_i a_i + \frac{1}{n}\sum b - \theta\Big)^2 \\ = \Big(\frac{1}{n}Y_i a_i + b - \theta\Big)^2$$ We take the derivative with respect to $b$: $$2\Big(\frac{1}{n}\sum Y_i a_i + b - \theta\Big) $$ Setting that to zero we have $$b^* = \frac{n\theta}{2\sum Y_ia_i}$$

Before I move on to follow the same steps for $a$, I am wondering whether this is the correct approach, since I am having a hard time interpreting optimal values for our estimator's parameters that depend on the unknown $\theta$.

Best answer:

You have mistakes in your simplification, and you need to take the expectation $E[(\hat\theta-\theta)^2]$ before differentiating. Using the constraint $\sum_ia_i=n$ (so that $\theta=\frac{1}{n}\sum_ia_i\theta$),
\begin{align*} (\hat\theta-\theta)^2&=\Big(\frac{1}{n}\sum_ia_iY_i+b-\theta\Big)^2\\ &=\Big(\frac{1}{n}\sum_ia_i(Y_i-\theta)+b\Big)^2\\ &=\frac{1}{n^2}\Big[\sum_ia_i(Y_i-\theta)\Big]^2+b^2+\frac{2}{n}\Big[\sum_ia_i(Y_i-\theta)\Big]b. \end{align*}
Note that
\begin{align*} E\Big[\sum_ia_i(Y_i-\theta)\Big]&=\sum_ia_iE(Y_i-\theta)=0,\\ E\Big[\Big(\sum_ia_i(Y_i-\theta)\Big)^2\Big]&=\operatorname{Var}\Big[\sum_ia_i(Y_i-\theta)\Big]=\sum_ia_i^2\operatorname{Var}(Y_i-\theta)=\frac{1}{2}\sum_ia_i^2. \end{align*}
Then:
$$ E[(\hat\theta-\theta)^2]=\frac{1}{2n^2}\sum_ia_i^2+b^2. $$
It remains to use Cauchy-Schwarz,
$$ n\sum_ia_i^2=\sum_i1^2\sum_ia_i^2\geq\Big(\sum_i1\cdot a_i\Big)^2=n^2\implies\sum_ia_i^2\geq n, $$
with equality if and only if all the $a_i$ are equal, which together with the constraint $\frac{1}{n}\sum_ia_i=1$ forces $a_i=1$. Combined with the observation $b^2\geq 0$, the optimum is $a_i=1$ $\forall i$ and $b=0$, giving the minimal MSE $\frac{1}{2n}$.
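The closed-form MSE derived above, $\frac{1}{2n^2}\sum_ia_i^2+b^2$, can be compared across a few feasible weight vectors (each satisfying $\frac{1}{n}\sum_i a_i = 1$) to see that the uniform weights attain the Cauchy-Schwarz lower bound. The function name `mse_closed_form` and the specific candidate vectors are illustrative.

```python
import numpy as np

def mse_closed_form(a, b, var_eps=0.5):
    """MSE formula from the derivation: var_eps/n^2 * sum(a_i^2) + b^2."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    return var_eps * np.sum(a**2) / n**2 + b**2

n = 4
# Each candidate satisfies the constraint (1/n) * sum a_i = 1.
candidates = {
    "uniform a_i = 1": np.ones(n),
    "tilted":          np.array([0.5, 1.5, 0.5, 1.5]),
    "one-heavy":       np.array([4.0, 0.0, 0.0, 0.0]),
}
for name, a in candidates.items():
    print(f"{name}: MSE = {mse_closed_form(a, 0.0)}")

# By Cauchy-Schwarz, sum a_i^2 >= n, so uniform weights with b = 0
# achieve the lower bound 1/(2n) = 0.125 for n = 4.
print("lower bound 1/(2n):", 1 / (2 * n))
```

Any nonzero $b$ only adds $b^2$ to the MSE, so $b = 0$ is optimal regardless of the weights.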