Is the maximum likelihood estimator for the mean equal to the sample mean under all densities?


Suppose, I have a sample from an unknown distribution. I want to prove/disprove (mathematically!) the following statement:

The maximum likelihood (ML) estimator of the (unknown) population mean always equals the sample mean, irrespective of the likelihood function. We assume the likelihood function is twice differentiable with respect to its parameters and that the underlying density (pdf) has finite mean and variance.

I believe the statement is straightforward for symmetric distributions: to maximize the likelihood, the center of mass should be placed at the sample mean, so the sample mean should be the estimator that maximizes the likelihood of the observed data. But I cannot show this mathematically.
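As a concrete instance of the symmetric case (not part of the original question): for the Normal family $N(\mu, \sigma^2)$ with $\sigma^2$ known, the MLE of $\mu$ is indeed the sample mean, which can be seen directly from the log-likelihood:

$$\ell(\mu) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^n (X_i - \mu)^2, \qquad \frac{\partial \ell}{\partial \mu} = \frac{1}{\sigma^2}\sum_{i=1}^n (X_i - \mu) = 0 \;\iff\; \hat\mu = \bar X.$$

Note, however, that this argument is specific to the Normal family; symmetry alone does not guarantee it (e.g. the MLE of the center of a Laplace distribution is the sample median).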


Your question is a little underspecified: you need to specify a parametric family of distributions. It's not enough to say "an unknown distribution," since otherwise it's not clear what "maximum likelihood estimate" means.

Anyway, this is false in general. Consider, for example, the uniform distribution on $[0, \theta]$, where $\theta$ is the unknown parameter to be estimated (so the true mean is $\frac{\theta}{2}$). It's an easy exercise to show that the maximum likelihood estimate of $\theta$ from $n$ samples $X_1, \dots, X_n$ is $\max(X_1, \dots, X_n)$, so the MLE of the true mean is half the sample maximum, which in general differs from the sample mean.
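The counterexample is easy to check numerically. A minimal sketch (the variable names and the seed are illustrative, not from the answer above):

```python
import random

# Compare the sample mean with the MLE of the mean for Uniform[0, theta].
# For this family the MLE of theta is max(X_1, ..., X_n), so the MLE of
# the mean theta/2 is half the sample maximum.
random.seed(0)
theta = 10.0  # true parameter; true mean is theta / 2 = 5
n = 1000
sample = [random.uniform(0, theta) for _ in range(n)]

sample_mean = sum(sample) / n
mle_of_mean = max(sample) / 2  # MLE of theta is the sample maximum

print(f"sample mean: {sample_mean:.4f}")
print(f"MLE of mean: {mle_of_mean:.4f}")
```

The two estimators disagree on essentially every sample, which settles the question in the negative. (Note also that the sample maximum is always strictly below $\theta$, so this MLE of the mean is biased downward.)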