Maximum likelihood estimator normal distribution


The density function of the normal distribution is given as \begin{align} f(x;\mu, \sigma^2) = \frac{1}{\sqrt{2 \pi } \sigma} e^{-\frac{1}{2 \sigma^2}(x - \mu)^2} \end{align} with $\mu \in \mathbb{R}$ and $\sigma >0$. The maximum likelihood estimator of $\sigma^2$ is given as \begin{align} \hat{\sigma}^2 = \frac{1}{n} \sum\limits_{i=1}^n (x_i - \overline{x})^2. \end{align} For $x = (x_1, \dots, x_n) = (0, \dots, 0)$ this gives $\hat{\sigma}^2 = 0$, which is not possible since $\sigma > 0$. So my question is how to define the maximum likelihood estimator correctly in this case. Do I state the maximum likelihood estimator only for $x \in \mathbb{R}^n \setminus \{(0, \dots, 0)\}$, or is there a more elegant way to write this down?


There are 2 best solutions below


In your post, $\hat{\sigma}^2$ is an estimator of the true unknown scale parameter $\sigma^2$. As you mentioned, under the statistical model you assumed for your observations, $\hat{\sigma}^2$ is given by:

$$ \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} \left( x_i - \overline{x} \right)^2. $$

We see from this definition that $\hat{\sigma}^2 \geq 0$. But there is no guarantee that $\hat{\sigma}^2 > 0$.

Indeed, to have $\hat{\sigma}^2 > 0$, at least one of your sample values $x_i$ must differ from the sample mean $\overline{x}$.

In your example, $\overline{x} = 0$ and $x_i = \overline{x}$ for every $i$. Therefore, $\hat{\sigma}^2 = 0$.
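A minimal numerical sketch of this point (the function name is mine): the MLE variance is an average of squared deviations, so it is nonnegative, and it is exactly $0$ whenever every observation equals the sample mean.

```python
def mle_variance(xs):
    """Maximum likelihood estimate of the variance:
    (1/n) * sum of (x_i - mean)^2 over the sample."""
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / n

print(mle_variance([1.0, 2.0, 3.0]))  # 2/3, a strictly positive estimate
print(mle_variance([0.0, 0.0, 0.0]))  # 0.0 -- every x_i equals the mean
```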


To take that approach you would need to exclude all the other cases where every value in the sample is identical, such as $(5.37, 5.37, \ldots, 5.37)$. But in all such cases the natural estimate of the variance is $0$: any positive estimate of the variance leads to a lower likelihood than half that estimate, so the likelihood is maximized in the limit as the estimate tends to $0$.
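This limiting argument can be checked numerically: for a constant sample, with $\mu$ set to the sample mean, the Gaussian log-likelihood increases every time $\sigma^2$ is halved, so no positive $\sigma^2$ can be a maximizer. A small sketch (function names are mine, not from the post):

```python
import math

def log_likelihood(xs, mu, sigma2):
    """Gaussian log-likelihood of the sample xs under N(mu, sigma2)."""
    n = len(xs)
    ss = sum((x - mu) ** 2 for x in xs)
    return -0.5 * n * math.log(2 * math.pi * sigma2) - ss / (2 * sigma2)

xs = [5.37] * 4          # a constant sample, as in the answer
mu = sum(xs) / len(xs)   # sample mean, here 5.37
s2 = 1.0
for _ in range(5):
    print(s2, log_likelihood(xs, mu, s2))
    s2 /= 2              # halving sigma^2 raises the log-likelihood each time
```

Since the squared-deviation term vanishes for a constant sample, the log-likelihood reduces to $-\frac{n}{2}\log(2\pi\sigma^2)$, which grows without bound as $\sigma^2 \to 0$.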

Rather than over-complicating the answer in the way you propose, the solution is to allow $\hat{\sigma}^2 = 0$ when every value in the sample is identical, which includes the case $n = 1$.