Maximum Likelihood Estimation for a Normal Distribution with mean $\theta$ and variance $2\theta$, where $\theta>0$


I'm using the log-likelihood and eventually arrive at the quadratic equation $n{\theta}^{2} + 2n\theta - \sum_{i=1}^{n}X_i^2 = 0$. Solving it gives $\hat\theta = \frac{-n + \sqrt{n^2 + n\sum_{i=1}^{n}X_i^2}}{n}$, which seems rather cumbersome, and it's hard to check whether this solution is indeed a maximum of the log-likelihood function. Is the solution correct and there simply isn't an easier way to do it, or am I missing something here?
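One way to probe this is a quick numerical check: simulate data from $N(\theta, 2\theta)$, evaluate the log-likelihood on a grid, and compare the grid maximizer with the root of the quadratic. (A sketch only; the true $\theta$, the sample size, and the grid range below are arbitrary simulation assumptions.)

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.5                 # arbitrary assumed value for the simulation
n = 1000
x = rng.normal(theta_true, np.sqrt(2 * theta_true), size=n)

# positive root of n*theta^2 + 2*n*theta - sum(x_i^2) = 0
sum_sq = np.sum(x ** 2)
theta_hat = (-n + np.sqrt(n ** 2 + n * sum_sq)) / n

def loglik(t):
    # log-likelihood of N(t, 2t); vectorized via
    # sum((x_i - t)^2) = sum(x_i^2) - 2*t*sum(x_i) + n*t^2
    rss = sum_sq - 2 * t * np.sum(x) + n * t ** 2
    return -0.5 * n * np.log(4 * np.pi * t) - rss / (4 * t)

grid = np.linspace(0.5, 8.0, 200_001)
theta_grid = grid[np.argmax(loglik(grid))]
print(theta_hat, theta_grid)     # the two should agree to grid precision
```

The grid maximizer and the closed-form root should coincide up to the grid spacing, which at least confirms that the root sits at the peak of the log-likelihood rather than at a trough.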



Best answer:

You should inspect your log-likelihood function (which you should have included in the question):

$$\ell(\theta)=-\frac{n}{2}\log (4\pi\theta) - \frac{1}{4\theta}\sum_{i=1}^{n}(X_i - \theta)^2$$

Its (natural [*]) domain is $(0,+\infty)$. The function is continuous and differentiable in all its domain. The behaviour at the extremes is: $$\lim_{\theta \to 0^+}\ell(\theta)=-\infty$$ (the second term wins) and $$\lim_{\theta \to +\infty}\ell(\theta)=-\infty$$

You've found a single critical point inside the domain. Since the function is differentiable everywhere in $(0,+\infty)$ and tends to $-\infty$ at both ends, that critical point must be a local maximum, and in fact the global maximum.

The fact that the critical point is a local maximum can also be seen by computing the second derivative.
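Concretely, differentiating $\ell'(\theta) = -\frac{n}{2\theta} + \frac{1}{4\theta^2}\sum_{i=1}^{n}X_i^2 - \frac{n}{4}$ (the derivative that yields the quadratic in the question) gives

$$\ell''(\theta) = \frac{n}{2\theta^2} - \frac{1}{2\theta^3}\sum_{i=1}^{n}X_i^2.$$

At the critical point the quadratic gives $\sum_{i=1}^{n}X_i^2 = n\hat\theta^2 + 2n\hat\theta$, so

$$\ell''(\hat\theta) = \frac{n}{2\hat\theta^2} - \frac{n\hat\theta^2 + 2n\hat\theta}{2\hat\theta^3} = -\frac{n}{2\hat\theta} - \frac{n}{2\hat\theta^2} < 0,$$

confirming a local maximum.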


[*] It should be stressed that the domain of the parameter should be part of the likelihood definition (as with any function). I'm just assuming here the "natural" one (the largest subset of the reals), but if your problem statement somehow implies a more restricted domain, the answer could differ.

Another answer:

$$ L(\theta) \propto \frac{1}{\theta^{n/2}} \exp\left( \frac{-1}{2(2\theta)} \sum_{i=1}^n(x_i-\theta)^2 \right) $$

Let $\overline x = \frac 1 n \sum_{i=1}^n x_i$ and $s^2 = \frac 1 n \sum_{i=1}^n (x_i-\overline x)^2.$ Then algebra tells us that

$$ \sum_{i=1}^n (x_i-\theta)^2 = ns^2 + n(\overline x - \theta)^2. $$

Hence

$$ L(\theta) \propto \frac{1}{\theta^{n/2}} \exp\left( \frac{-1}{4\theta} \left( ns^2 + n(\overline x -\theta)^2 \right) \right), $$

so

\begin{align} \ell(\theta) & = \log L(\theta) = \text{constant} - \frac n 2 \log\theta - \frac {ns^2} {4\theta} - \frac{n(\overline x-\theta)^2}{4\theta} \\[10pt] & = \text{constant} - \frac n 2 \log\theta - \frac{ns^2}{4\theta} - \frac{n\overline x^2}{4\theta} - \frac{n\theta} 4, \\[10pt] \ell\,'(\theta) & = - \frac n {2\theta} + \frac{ns^2}{4\theta^2} +\frac{n\overline x^2}{4\theta^2} - \frac n 4 \\[10pt] & = \frac {-n} {4\theta^2} \left( \theta^2 + 2\theta - s^2 - \overline x^2 \right) \\[10pt] & = (\text{negative factor}) \cdot \left( (\theta+1)^2 -s^2-\overline x^2 - 1 \right). \end{align}

Since $\theta$ cannot be negative, this derivative vanishes only at

$$ \theta = -1 +\sqrt{s^2+\overline x^2 + 1}. $$
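Note that $s^2 + \overline x^2 = \frac 1 n \sum_{i=1}^n x_i^2$, so this form agrees with the root found in the question. A short numerical sketch of that equivalence (the simulated parameters below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(3.0, np.sqrt(6.0), size=500)  # assumed: theta = 3, variance 2*theta = 6
n = len(x)

xbar = x.mean()
s2 = np.mean((x - xbar) ** 2)

# this answer's closed form
theta_answer = -1 + np.sqrt(s2 + xbar ** 2 + 1)
# the question's closed form
theta_question = (-n + np.sqrt(n ** 2 + n * np.sum(x ** 2))) / n

# s^2 + xbar^2 = (1/n) * sum(x_i^2), so the two estimators are identical
print(theta_answer, theta_question)
```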