MLE for normal distribution


Suppose $\{X_i\}_{i=1}^n$ is a collection of iid random variables, $X_i \sim \mathcal{N}(\theta,1).$ Prove that with no restrictions on $\theta$, the maximum likelihood estimator of $\theta$ is $\overline{X} = \frac{1}{n} \sum_{i=1}^n X_i$.

I've got most of it: $$\mathcal{L}(\theta \mid x) = (2\pi)^{-n/2} \exp\left\{-\frac{1}{2} \sum_{i=1}^n (x_i-\theta)^2\right\}$$ and thus \begin{align} \frac{\partial \mathcal{L}}{\partial \theta} = 0 &\implies (2\pi)^{-n/2}\exp\left\{-\frac{1}{2} \sum_{i=1}^n (x_i-\theta)^2\right\}\sum_{i=1}^n (x_i-\theta) = 0\\ &\implies \sum_{i=1}^n (x_i - \theta) = 0 \\ &\implies \theta^* = \frac{1}{n}\sum_{i=1}^n x_i, \end{align} where the second implication holds because the exponential factor is strictly positive. I'm trying to confirm that this is a maximum and not a minimum. We have $$\frac{\partial^2 \mathcal{L}}{\partial \theta^2} = (2\pi)^{-n/2}\exp\left\{-\frac{1}{2} \sum_{i=1}^n (x_i-\theta)^2\right\}\left[\left(\sum_{i=1}^n (x_i-\theta)\right)^2-n\right].$$ Is this computation correct? And how can I show that $\left[\sum_{i=1}^n (x_i-\theta)\right]^2-n < 0$ at $\theta = \theta^*$, so that $\theta^*$ is guaranteed to be a maximum rather than a minimum or inflection point?
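As a quick numerical sanity check of the derivation (not part of the proof), one can evaluate the log-likelihood on a grid of $\theta$ values for a simulated sample and confirm the maximizer lands at the sample mean. The sample size, seed, and true mean below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=50)  # iid draws from N(theta=2, 1)

def log_likelihood(theta, x):
    # log of (2*pi)^(-n/2) * exp(-0.5 * sum((x_i - theta)^2))
    n = len(x)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((x - theta) ** 2)

# Grid search over theta near the sample mean
thetas = np.linspace(x.mean() - 1, x.mean() + 1, 2001)
values = [log_likelihood(t, x) for t in thetas]
best = thetas[int(np.argmax(values))]
print(best, x.mean())  # the grid maximizer sits next to the sample mean
```

The grid spacing is $10^{-3}$, so the grid maximizer agrees with $\overline{x}$ to within half a grid step.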

Best answer:

The second derivative must be negative at $\theta^*$. By the first-order condition, $\sum_{i=1}^n (x_i-\theta^*) = 0$, so the squared sum inside the brackets vanishes and the bracket reduces to $-n$, which is clearly negative. Since the exponential prefactor is strictly positive, the second derivative at $\theta^*$ is negative, confirming a maximum.

On a side note, it is easier here, and a good idea in many cases, to maximize the logarithm of the likelihood rather than the likelihood itself: the log turns the product into a sum and the exponential disappears, so the derivatives are much simpler.
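For completeness, here is how that route looks in this problem (a standard alternative derivation, not from the original post):

```latex
\ell(\theta) = \log \mathcal{L}(\theta \mid x)
            = -\frac{n}{2}\log(2\pi) - \frac{1}{2}\sum_{i=1}^n (x_i-\theta)^2,
\qquad
\ell'(\theta) = \sum_{i=1}^n (x_i-\theta),
\qquad
\ell''(\theta) = -n.
```

Setting $\ell'(\theta) = 0$ gives $\theta^* = \overline{x}$, and $\ell''(\theta) = -n < 0$ for every $\theta$, so $\ell$ is strictly concave and $\overline{x}$ is the unique global maximizer. Since $\log$ is increasing, $\overline{x}$ maximizes $\mathcal{L}$ as well, with no second-derivative casework needed.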