Let $X_1,\dots, X_n$ have a Weibull distribution with pdf $f(x) = \frac{\alpha}{\theta}x^{\alpha-1}e^{-\frac{x^\alpha}{\theta}}$. I'm trying to show that there is a unique solution for $\hat{\alpha}$, the MLE of $\alpha$.
After deriving the likelihood equations, my problem is showing that:
$h'(\alpha) = \dfrac{\sum x_i^\alpha \, \sum x_i^\alpha (\log \, x_i)^2 -(\sum x_i^\alpha\log x_i)^2}{(\sum x_i^\alpha)^2}+\dfrac{1}{\alpha^2}>0$
so that I can argue, together with other conditions, that the root of likelihood equation is unique.
My attempt was to rewrite $h'(\alpha) $ as:
$h'(\alpha) = \sum(\log x_i)^2 \dfrac{e^{\alpha \log x_i}}{\sum e^{\alpha \log x_i}} - \left(\sum \log x_i \dfrac{e^{\alpha \log x_i}}{\sum e^{\alpha \log x_i}} \right)^2 + \dfrac{1}{\alpha^2}$
and use Cauchy-Schwarz on $\left(\sum \log x_i \dfrac{e^{\alpha \log x_i}}{\sum e^{\alpha \log x_i}} \right)^2$ to argue that:
$\sum (\log x_i)^2\sum\left(\dfrac{e^{\alpha\log x_i}}{\sum e^{\alpha\log x_i}}\right)^2 - \left(\sum\log x_i \dfrac{e^{\alpha\log x_i}}{\sum e^{\alpha \log x_i}}\right)^2\geq 0$
But then I could not show that $\sum(\log x_i)^2\dfrac{e^{\alpha\log x_i}}{\sum e^{\alpha\log x_i}}\geq \sum(\log x_i)^2\sum\left(\dfrac{e^{\alpha\log x_i}}{\sum e^{\alpha\log x_i}}\right)^2$, which is what I would need to conclude that $h'(\alpha) > 0$ (the remaining term $\frac{1}{\alpha^2}$ is positive, so it causes no trouble).
Any suggestions on how to tackle this problem?
Note: the book says that $h'(\alpha)>0$, but it doesn't show how; it simply states it.
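As a numerical sanity check (not a proof), here is a short NumPy sketch that evaluates $h'(\alpha)$ on a simulated sample over a grid of $\alpha$ values; the function name `h_prime` and the sample parameters are my own choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.weibull(1.5, size=20)  # simulated sample; shape 1.5 chosen arbitrarily

def h_prime(alpha, x):
    """h'(alpha) as written above: weighted variance of log(x_i)
    under weights proportional to x_i^alpha, plus 1/alpha^2."""
    y = x ** alpha
    w = y / y.sum()                 # e^{alpha log x_i} / sum e^{alpha log x_j}
    lx = np.log(x)
    var = np.sum(w * lx**2) - np.sum(w * lx) ** 2
    return var + 1.0 / alpha**2

for a in [0.1, 0.5, 1.0, 2.0, 5.0]:
    print(a, h_prime(a, x))         # all values come out positive
```

On every sample I tried, $h'(\alpha)$ is indeed strictly positive, consistent with the book's claim.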
Well, you have to show that $$(y_1+\dots+y_n)(y_1\log(x_1)^2+\dots+y_n\log(x_n)^2)\geq(y_1\log(x_1)+\dots+y_n\log(x_n))^2$$
where $y_i=x_i^\alpha$.
But notice that the LHS equals
$$((\sqrt{y_1})^2+\dots+(\sqrt{y_n})^2)((\sqrt{y_1}\log(x_1))^2+\dots+(\sqrt{y_n}\log(x_n))^2)$$
And now you can apply Cauchy–Schwarz to get $$((\sqrt{y_1})^2+\dots+(\sqrt{y_n})^2)((\sqrt{y_1}\log(x_1))^2+\dots+(\sqrt{y_n}\log(x_n))^2) \geq (y_1\log(x_1)+\dots+y_n\log(x_n))^2$$
which is what you want.
This crucially depends on $y_i = x_i^\alpha \geq 0$, so that its square root is real.
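The Cauchy–Schwarz step above can be checked numerically; this is only a sketch confirming the inequality on random data, with $y_i = x_i^\alpha$ as in the answer:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.weibull(2.0, size=15)   # positive sample, so y_i = x_i^alpha > 0
alpha = 1.7                     # arbitrary alpha for the check

y = x ** alpha
lx = np.log(x)

# LHS and RHS of the Cauchy-Schwarz inequality used in the answer:
lhs = y.sum() * np.sum(y * lx**2)   # (sum y_i)(sum y_i log(x_i)^2)
rhs = np.sum(y * lx) ** 2           # (sum y_i log(x_i))^2
print(lhs >= rhs)                   # True
```

Equality would hold only if all $x_i$ were equal, in which case the $+\frac{1}{\alpha^2}$ term still makes $h'(\alpha)$ strictly positive.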