How to prove minimaxity for the normal distribution.


Let $X$ be a sample from $N_p(\theta,I_p)$ with unknown $\theta\in \mathbb{R}^p$.

Consider the estimation of $\theta$ under the loss function $$L(\theta,T)\,=\,\|T-\theta\|^2\,=\,\sum_{i=1}^{p}(T_i-\theta_i)^2,$$ with independent priors for the $\theta_i$'s.

How can I show that $X$ is a minimax estimator of $\theta$?

My thought is:

$R_X(\theta)=E_\theta[L(\theta,X)]=E\Big[\sum_{i=1}^{p}(\theta_i-X_i)^2\Big]=\sum_{i=1}^{p}E\big(\theta_i^2-2\theta_i X_i+X_i^2\big)=p$, since each $X_i\sim N(\theta_i,1)$ gives $E(X_i^2)=\theta_i^2+1$. The posterior mean $E(\theta_i\mid X_i)$ is the $\psi$-Bayes rule w.r.t. $\Pi$, where $\psi$ is the class of rules $T(X)$ satisfying $E[T(X)]^2<\infty$ for every $\theta$. Since $E[E(\theta_i\mid X_i)]=\theta_i=E(X_i)$, $X_i$ is a Bayes estimator with constant risk. So $X_i$ is minimax, and hence $X$ is minimax.
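The constant-risk claim $R_X(\theta)=p$ for every $\theta$ can be checked numerically; a minimal Monte Carlo sketch (the choice of $p$, sample size, and test points $\theta$ are mine, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5            # dimension (arbitrary choice for illustration)
n_rep = 200_000  # Monte Carlo replications

# Risk of the estimator T(X) = X under squared-error loss is
# E||X - theta||^2 = p for every theta, since X_i ~ N(theta_i, 1).
for theta in [np.zeros(p), np.full(p, 3.0), rng.normal(size=p)]:
    # n_rep independent draws of X ~ N_p(theta, I_p)
    X = theta + rng.normal(size=(n_rep, p))
    risk = np.mean(np.sum((X - theta) ** 2, axis=1))
    print(f"theta = {np.round(theta, 2)}: estimated risk = {risk:.3f} (target {p})")
```

The estimated risk stays near $p$ no matter which $\theta$ generates the data, which is exactly the constant-risk property the argument relies on.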

But I don't know whether this is correct, e.g., can we actually conclude that $X$ is a Bayes estimator?
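For intuition on that last question: under the conjugate prior $\theta\sim N_p(0,\tau^2 I_p)$, the Bayes estimator is the posterior mean $\frac{\tau^2}{1+\tau^2}X$ (not $X$ itself), with closed-form Bayes risk $p\,\tau^2/(1+\tau^2)$, which increases to $p$ as $\tau^2\to\infty$. A short numeric sketch of that closed form (this is the standard limit-of-Bayes-risk quantity, shown here under my own choice of $p$ and grid of $\tau^2$ values):

```python
p = 5  # dimension (arbitrary choice for illustration)

def bayes_risk(tau2: float, p: int) -> float:
    """Bayes risk of the posterior mean (tau^2/(1+tau^2)) * X under
    the prior theta ~ N_p(0, tau^2 I_p) and squared-error loss."""
    return p * tau2 / (1.0 + tau2)

for tau2 in [1.0, 10.0, 100.0, 10_000.0]:
    print(f"tau^2 = {tau2:>8}: Bayes risk = {bayes_risk(tau2, p):.4f}")
```

The Bayes risks approach $p$, the constant risk of $X$, from below. That is the usual route to minimaxity here: $X$ is a limit of Bayes estimators whose Bayes risks converge to its supremum risk.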