Minimax Estimator for Normal Random Vector


Question. Suppose $Y_1, Y_2$ are independent with $Y_i \sim N(\mu_i, 1)$. Let $Y := (Y_1, Y_2)$ and $T_Y := (Y_1, 0)$. Denote by $\Theta$ the parameter space of mean vectors $\mu := (\mu_1, \mu_2)$. Is it necessarily true that $T_Y$ is minimax if

$$\Theta := \{\mu : \mu_2 = 0\} \quad\quad\text{or}\quad\quad \Theta := \{\mu : \mu_1 = 0\}.$$

I'm not quite sure how to show these statements. Starting with the first case, the risk function of $T_Y$ under squared-error loss is

$$ \begin{align} R(\mu, T_Y) &= \mathbb{E}||T_Y - \mu||^2 \\ &= \mathbb{E}||(Y_1 - \mu_1, 0)^T||^2 \\ &= \mathbb{E}\left(Y_1^2 - 2Y_1\mu_1 + \mu_1^2\right) \\ &= \mathbb{E}Y_1^2 - 2\mu_1\mathbb{E}Y_1 + \mu_1^2 \\ &= 1 + \mu_1^2 - \mu_1^2 \\ &= 1. \end{align} $$
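As a quick sanity check on this computation (my own sketch, not part of the original question), a Monte Carlo estimate of $R(\mu, T_Y)$ for several $\mu = (\mu_1, 0)$ should hover near 1 regardless of $\mu_1$:

```python
import numpy as np

rng = np.random.default_rng(0)

def risk_TY(mu1, n=200_000):
    """Monte Carlo estimate of R(mu, T_Y) for mu = (mu1, 0).

    T_Y = (Y_1, 0), so when mu_2 = 0 the loss is
    ||T_Y - mu||^2 = (Y_1 - mu1)^2 and Y_2 never enters.
    """
    y1 = rng.normal(mu1, 1.0, size=n)
    return np.mean((y1 - mu1) ** 2)

for mu1 in (-3.0, 0.0, 2.5):
    print(f"mu1 = {mu1:+.1f}  estimated risk = {risk_TY(mu1):.3f}")
```

Each printed estimate should be close to 1, consistent with the constant risk derived above.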

So, $T_Y$ is a minimax estimator if

$$\sup_{\mu \in \Theta} R(\mu, T_Y) = \inf_T\sup_{\mu \in \Theta} R(\mu, T) = 1.$$ I'm not sure how to proceed from here. I assume the second case can be handled similarly.
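One standard route to the matching lower bound (a sketch under my own assumptions, not necessarily the intended argument) is via limiting Bayes risks: under the prior $\mu_1 \sim N(0, \tau^2)$ (with $\mu_2 = 0$), normal–normal conjugacy gives Bayes risk $\tau^2/(1+\tau^2)$, which tends to 1 as $\tau \to \infty$. Since the minimax risk dominates every Bayes risk, this would force $\inf_T \sup_{\mu \in \Theta} R(\mu, T) \ge 1$, matching the constant risk of $T_Y$. The formula can be tabulated directly:

```python
# Bayes risk for estimating mu1 from Y1 ~ N(mu1, 1) under the prior
# mu1 ~ N(0, tau^2): the posterior mean shrinks Y1 by tau^2/(1 + tau^2),
# and the resulting Bayes risk is tau^2 / (1 + tau^2).
def bayes_risk(tau):
    return tau**2 / (1.0 + tau**2)

for tau in (1.0, 10.0, 100.0):
    # climbs toward 1 as the prior flattens out
    print(f"tau = {tau:>6.1f}  Bayes risk = {bayes_risk(tau):.6f}")
```

The risks increase monotonically toward 1, which is what the limiting-Bayes lower bound needs.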