Limiting behavior of the log-likelihood function for the Weibull distribution


Let $X_{1}, \cdots, X_{n}$ be a random sample from the Weibull distribution $\operatorname{Weibull}(\alpha,1)$, $\alpha >0$. Prove that the solution of the likelihood equation $\dot{l}(\alpha)=0$ is the MLE for $\alpha$.

Here is my attempt:

Since $f(x;\alpha)=\alpha x^{\alpha-1}\exp{(-x^{\alpha})}I_{x>0}$ , $$l(\alpha)=n\log{\alpha}+(\alpha-1)\sum_{i=1}^{n}{\log{x_{i}}}-\sum_{i=1}^{n}{x_i^\alpha}$$ $$\dot{l}(\alpha)=\frac{n}{\alpha}+\sum_{i=1}^{n}{\log{x_{i}}}-\sum_{i=1}^{n}{x_i^\alpha \log{x_{i}}}$$ $$\ddot{l}(\alpha)=-\frac{n}{\alpha ^2}-\sum_{i=1}^{n}{x_i^\alpha( \log{x_{i}})^2}<0$$

Therefore, $l(\alpha)$ is a strictly concave function.

Meanwhile, $$\lim_{\alpha \to +\infty}{l(\alpha)}=-\infty$$ $$\lim_{\alpha \to 0^+}{l(\alpha)}=-\infty$$

Since $l$ is strictly concave and tends to $-\infty$ at both endpoints, it attains a unique interior maximum; hence the equation $\dot{l}(\alpha)=0$ has a unique solution, which is the MLE for $\alpha$.
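The derivatives above can be sanity-checked numerically. Below is a minimal Python sketch; the simulated sample, seed, and chosen shape parameter are assumptions for illustration, not part of the problem:

```python
import math
import random

# Hypothetical sample from Weibull(alpha, 1) with assumed shape alpha = 2.
random.seed(0)
x = [random.weibullvariate(1.0, 2.0) for _ in range(50)]
n = len(x)
sum_log = sum(math.log(xi) for xi in x)

def l(a):       # log-likelihood l(alpha)
    return n * math.log(a) + (a - 1) * sum_log - sum(xi**a for xi in x)

def l_dot(a):   # first derivative (score)
    return n / a + sum_log - sum(xi**a * math.log(xi) for xi in x)

def l_ddot(a):  # second derivative; strictly negative for all alpha > 0
    return -n / a**2 - sum(xi**a * math.log(xi) ** 2 for xi in x)

# Strict concavity: l''(alpha) < 0 across a grid of alpha values.
assert all(l_ddot(0.1 * k) < 0 for k in range(1, 101))
```

On such a sample the score is large and positive for small $\alpha$ and eventually negative for large $\alpha$, consistent with a unique interior root.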

I know the problem can be solved this way, but I'm struggling to prove that $$\lim_{\alpha \to +\infty}{l(\alpha)}=-\infty.$$

Best answer:

For the sake of convenience, let $$\xi = \sum_{i=1}^n \log x_i = \log \prod_{i=1}^n x_i, \quad g(\alpha) = \sum_{i=1}^n x_i^\alpha; \tag{1}$$ then $$\ell(\alpha) = n \log \alpha + (\alpha - 1) \xi - g(\alpha). \tag{2}$$

Now by the AM-GM inequality (since $x_1, \ldots, x_n > 0$), $$g(\alpha) \ge n \left(\prod_{i=1}^n x_i^\alpha\right)^{1/n} = n e^{\xi \alpha/n}. \tag{3}$$ So $$\ell(\alpha) \le n \log \alpha + (\alpha-1)\xi - n e^{\xi \alpha/n}. \tag{4}$$ We easily see that for $n > 0$, $\ell(\alpha) \to -\infty$ as $\alpha \to 0^+$ irrespective of $\xi$.
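The AM-GM bound $(3)$ can be checked numerically. A short Python sketch (the sample and seed are hypothetical, chosen only to illustrate the inequality):

```python
import math
import random

# Hypothetical positive sample; any x_1, ..., x_n > 0 satisfies the bound.
random.seed(1)
x = [random.weibullvariate(1.0, 2.0) for _ in range(20)]
n = len(x)
xi = sum(math.log(v) for v in x)  # xi = sum_i log x_i

def g(a):
    """g(alpha) = sum_i x_i^alpha."""
    return sum(v**a for v in x)

# AM-GM: g(alpha) >= n * (prod x_i^alpha)^(1/n) = n * exp(xi * alpha / n).
for a in [0.5, 1.0, 2.0, 5.0, 10.0]:
    assert g(a) >= n * math.exp(xi * a / n) - 1e-9
```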

If $\xi > 0$, then the last term in $(4)$ dominates, and $\ell(\alpha) \to -\infty$ as $\alpha \to \infty$. If $\xi < 0$, then the second term dominates and again the log-likelihood approaches $-\infty$. The only case in which the inequality $(4)$ is inconclusive is if $\xi = 0$. So we return to $(2)$ and compute

$$\ell(\alpha \mid \xi = 0) = n \log \alpha - g(\alpha). \tag{5}$$

Let $x_{(n)} = \max_i x_i$ be the maximum order statistic. If $x_{(n)} > 1$, then $$g(\alpha) > x_{(n)}^\alpha \to \infty$$ as $\alpha \to \infty$, and this term dominates in $(5)$. The case $x_{(n)} < 1$ is impossible, because $\xi = 0$ implies $e^\xi = \prod_{i=1}^n x_i = 1$, so at least one $x_i$ must be at least $1$. The final case, $\xi = 0$ and $x_{(n)} = 1$, forces $x_i = 1$ for all $i$, so $g(\alpha) = n$ and $$\ell(\alpha \mid \xi = 0, g = n) = n (-1 + \log \alpha), \tag{6}$$ which has no global maximum, since $\ell'(\alpha) = n/\alpha > 0$ has no critical point. However, such a sample occurs with probability $0$.

Another answer:

I believe the issue is whether the solution of the first-order condition is sufficient for the global optimization problem.

Since you showed that the log-likelihood function is strictly concave, any solution of the first-order condition $\dot l(\alpha)=0$ is indeed the strict (unique) global maximizer. I don't see what more you need to prove; this is a well-known result from optimization theory.

But to actually find this optimal $\alpha$, you will need a numerical method, as no closed-form solution is available in this case.
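For instance, since $\ddot l(\alpha) < 0$ everywhere, Newton-Raphson on the score equation works well. A minimal sketch in Python; the simulated sample, seed, and starting point are assumptions for illustration:

```python
import math
import random

# Hypothetical sample from Weibull(alpha, 1) with assumed shape alpha = 1.5.
random.seed(0)
x = [random.weibullvariate(1.0, 1.5) for _ in range(200)]
n = len(x)
sum_log = sum(math.log(v) for v in x)

def l_dot(a):
    """Score function: first derivative of the log-likelihood."""
    return n / a + sum_log - sum(v**a * math.log(v) for v in x)

def l_ddot(a):
    """Second derivative; strictly negative for all a > 0."""
    return -n / a**2 - sum(v**a * math.log(v) ** 2 for v in x)

def weibull_mle(a0=1.0, tol=1e-10, max_iter=100):
    """Newton-Raphson iteration on the likelihood equation l'(alpha) = 0."""
    a = a0
    for _ in range(max_iter):
        a_new = a - l_dot(a) / l_ddot(a)
        if a_new <= 0:        # safeguard: keep the iterate in the domain alpha > 0
            a_new = a / 2
        if abs(a_new - a) < tol:
            return a_new
        a = a_new
    raise RuntimeError("Newton iteration did not converge")

alpha_hat = weibull_mle()  # typically lands close to the assumed shape for a large sample
```

Because the log-likelihood is strictly concave, the iteration converges to the unique root from any reasonable starting point.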