If an MVUE exists, must it be the MLE?


I'm trying to understand this theorem: Let $\theta=(\theta_1,\dots,\theta_p)$ be the parameters of a distribution. If $\theta$ has a minimum variance unbiased estimator $\hat{\theta}$, then $\hat{\theta}$ must be the maximum likelihood estimator obtained by solving the score equation $U(x;\hat{\theta})=0$.

It is not clear to me why this is true, although I understand that the MLE of $\theta$ maximizes the joint density function $f(x;\theta)$.

This is incorrect as stated: there is no reason for the MLE to be an unbiased estimator. For example, in the normal setting $N(\mu, \sigma^2)$ with both parameters unknown, the MLE of $\sigma^2$ is $$ \frac{1}{n}\sum(X_i - \bar{X})^2, $$ which is biased ($E[\hat{\sigma}^2_{\text{MLE}}] = \frac{n-1}{n}\sigma^2$), while the UMVUE is the sample variance $$ \frac{1}{n-1}\sum(X_i - \bar{X})^2. $$ Since the two estimators differ, the UMVUE cannot be the MLE here.
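A quick simulation illustrates the counterexample. The sketch below (a minimal illustration, not from the original answer; the sample size and number of replications are arbitrary choices) repeatedly draws normal samples and averages both estimators of $\sigma^2$, showing that the MLE is biased downward by roughly the factor $(n-1)/n$ while the UMVUE is approximately unbiased:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma2 = 0.0, 4.0   # true parameters (assumed for illustration)
n, reps = 10, 50_000    # small n makes the bias visible

mle = np.empty(reps)
umvue = np.empty(reps)
for i in range(reps):
    x = rng.normal(mu, np.sqrt(sigma2), size=n)
    ss = np.sum((x - x.mean()) ** 2)  # sum of squared deviations
    mle[i] = ss / n                   # MLE of sigma^2 (biased)
    umvue[i] = ss / (n - 1)           # sample variance (UMVUE, unbiased)

# The MLE's average underestimates sigma^2 by about (n-1)/n = 0.9,
# while the UMVUE's average is close to the true value 4.0.
print("mean of MLE:  ", mle.mean())
print("mean of UMVUE:", umvue.mean())
```

With $n = 10$, the averaged MLE settles near $0.9 \times 4 = 3.6$, confirming that the MLE and the UMVUE are genuinely different estimators.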