Uniformly minimum variance unbiased estimator


How can one prove that $\overline{X}=\frac{1}{n}\sum_{i=1}^n X_i$ is the uniformly minimum variance unbiased estimator (UMVUE) of $\mu$ when $X_1,\dots,X_n$ are i.i.d. $N(\mu,\sigma^2)$ and $\sigma^2$ is known?

Idea: Let $X=(X_1,X_2,\dots,X_n)$. We need to show that $E(\overline{X}-\mu)^2\leq E(f(X)-\mu)^2$ for every unbiased estimator $f(X)$ of $\mu$. Since $\overline{X}$ is a complete sufficient statistic, the Lehmann–Scheffé theorem immediately gives that $\overline{X}$ is the UMVUE of $\mu$. Can someone prove this statement directly, without applying the theorem?
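One direct route (a sketch, not claiming it is the only one) is via the Cramér–Rao lower bound, which applies here because the normal family with known $\sigma^2$ satisfies the usual regularity conditions:

```latex
% Fisher information for one observation from N(mu, sigma^2), sigma^2 known:
%   log f(x; mu) = -\tfrac12 \log(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2}
\frac{\partial}{\partial\mu}\log f(x;\mu) = \frac{x-\mu}{\sigma^2},
\qquad
I(\mu) = E\!\left[\left(\frac{X-\mu}{\sigma^2}\right)^{\!2}\right]
       = \frac{\operatorname{Var}(X)}{\sigma^4} = \frac{1}{\sigma^2}.

% Cramér–Rao bound for any unbiased estimator f(X) based on n i.i.d. observations:
\operatorname{Var}\bigl(f(X)\bigr) \;\ge\; \frac{1}{n\,I(\mu)} \;=\; \frac{\sigma^2}{n}.

% The sample mean attains this bound:
\operatorname{Var}(\overline{X}) = \frac{\sigma^2}{n},
```

so $\overline{X}$ has minimum variance among all unbiased estimators satisfying the Cramér–Rao regularity conditions, uniformly in $\mu$. One caveat: this argument only covers estimators to which the Cramér–Rao inequality applies; ruling out irregular unbiased estimators in full generality is exactly what the completeness/Lehmann–Scheffé argument handles.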