$X_1$ and $X_2$ are random variables that satisfy $\operatorname{E}[X_1] = \operatorname{E}[X_2] = \mu$ and $\operatorname{Var}[X_1] = \operatorname{Var}[X_2] = 1$. Specify the range for $\mu$ so that the point estimate $\hat \mu_1 = \frac{X_1 + X_2}{3} + 3$ has a lower mean squared error than the point estimate $\hat\mu_2 = \frac{X_1 + X_2}{2}$.
Hi, I was trying to find the values of $\mu$ that satisfy the stated condition. Using $\operatorname{MSE}[T] = \operatorname{Var}[T] + \operatorname{bias}^2[T]$, I arrived at $$\frac{1}{2} + \frac{(\mu-9)^2}{3} < \frac{1}{2} + 0^2,$$ but no value of $\mu$ satisfies this inequality, so I wonder whether I made a mistake somewhere.
Note that under the assumption of independence, $$\operatorname{MSE}[\hat \mu_1] = \operatorname{Var}\left[\frac{X_1 + X_2}{3} + 3\right] + \operatorname{E}^2\left[\frac{X_1 + X_2}{3} + 3 - \mu\right] = \frac{2}{9} + \left(3-\frac{\mu}{3}\right)^2,$$ whereas $$\operatorname{MSE}[\hat \mu_2] = \operatorname{Var}\left[\frac{X_1 + X_2}{2}\right] + \operatorname{E}^2\left[\frac{X_1 + X_2}{2} - \mu\right] = \frac{2}{4} + 0 = \frac{1}{2}.$$ Requiring $\operatorname{MSE}[\hat \mu_1] < \operatorname{MSE}[\hat \mu_2]$ gives $\left(3-\frac{\mu}{3}\right)^2 < \frac{1}{2} - \frac{2}{9} = \frac{5}{18}$, i.e. $|\mu - 9| < \sqrt{5/2}$, so the estimator $\hat\mu_1$ wins exactly when $9 - \sqrt{5/2} < \mu < 9 + \sqrt{5/2}$.
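As a sanity check on the two MSE formulas, here is a quick Monte Carlo sketch. It assumes $X_1, X_2$ are independent normals with mean $\mu$ and variance $1$ (normality is an extra assumption made only for the simulation; the MSE formulas need only the stated mean and variance). At $\mu = 9$ the bias of $\hat\mu_1$ vanishes, so the simulated MSEs should be close to $2/9 \approx 0.222$ and $1/2$:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 9.0        # a value inside the derived range, where bias of mu_hat1 is zero
n = 200_000     # number of simulated (X1, X2) pairs

# independent draws with mean mu and variance 1
x1 = rng.normal(mu, 1.0, n)
x2 = rng.normal(mu, 1.0, n)

mu_hat1 = (x1 + x2) / 3 + 3   # estimator mu_hat_1
mu_hat2 = (x1 + x2) / 2       # estimator mu_hat_2

# empirical mean squared errors
mse1 = np.mean((mu_hat1 - mu) ** 2)
mse2 = np.mean((mu_hat2 - mu) ** 2)

print(mse1, mse2)  # should be near 2/9 and 1/2 respectively
```

Varying `mu` outside the interval $\left(9 - \sqrt{5/2},\, 9 + \sqrt{5/2}\right)$ flips the comparison, since the bias term $(3 - \mu/3)^2$ then exceeds $5/18$.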