Proving the existence of MLE for logistic distribution


Let $X_{1},\cdots,X_{n}\overset{IID}{\sim}\operatorname{Logis}(\theta,\sigma),\theta\in\mathbb{R},\sigma>0$. Prove that there exists an MLE of $\eta = (\theta,\sigma)^T$.

I want to prove this by the following theorem:

For a log-likelihood function $l(\theta)$ whose second partial derivatives are all continuous, if the Hessian matrix is negative definite on the parameter space and $l(\theta) \to -\infty$ as $\theta$ approaches the boundary of the parameter space, then $\dot{l}(\theta) = 0$ has a unique solution, which is the MLE of $\theta$.
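Since the parameter here is two-dimensional, negative-definiteness of the $2\times 2$ Hessian $H$ reduces (by Sylvester's criterion applied to $-H$) to two scalar conditions:
$$H \prec 0 \iff \frac{\partial^2 l}{\partial \theta^2} < 0 \quad\text{and}\quad \det H = \frac{\partial^2 l}{\partial \theta^2}\,\frac{\partial^2 l}{\partial \sigma^2} - \left(\frac{\partial^2 l}{\partial \theta\,\partial \sigma}\right)^2 > 0.$$
The first condition will turn out to be immediate from the formula for $\partial^2 l/\partial\theta^2$ below; the determinant condition is the hard part.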

Since the pdf of the logistic distribution is $$f(x;\theta,\sigma)=\frac{1}{\sigma}\frac{e^{-\frac{x-\theta}{\sigma}}}{(1+e^{-\frac{x-\theta}{\sigma}})^2},$$ the log-likelihood function is $$l(\theta,\sigma) = -n\log{\sigma} -n\frac{\bar{x}-\theta}{\sigma} -2 \sum_{i=1}^{n}{\log{(1+e^{-\frac{x_{i}-\theta}{\sigma}})}}$$ and the Hessian matrix is $\begin{pmatrix} \frac{\partial^2 l}{\partial \theta^2} & \frac{\partial^2 l}{\partial \theta \partial\sigma} \\ \frac{\partial^2 l}{\partial \sigma \partial\theta} & \frac{\partial^2 l}{\partial \sigma^2} \end{pmatrix}$, where
$$ \begin{aligned} &\frac{\partial^2 l}{\partial \theta^2}=-\frac{2}{\sigma^2} \sum_{i=1}^n \frac{e^{-\frac{x_i-\theta}{\sigma}}}{\left(1+e^{-\frac{x_i-\theta}{\sigma}}\right)^2} \\ &\frac{\partial^2 l}{\partial \theta \partial \sigma} =\frac{\partial^2 l}{\partial \sigma \partial \theta} =-\frac{n}{\sigma^2}+\frac{2}{\sigma^3} \sum_{i=1}^n \frac{e^{-\frac{x_i-\theta}{\sigma}}\left(\sigma\left(1+e^{-\frac{x_i-\theta}{\sigma}}\right)-\left(x_i-\theta\right)\right)}{\left(1+e^{-\frac{x_i-\theta}{\sigma}}\right)^2} \\ &\frac{\partial^2 l}{\partial \sigma^2} =\frac{n}{\sigma^2}-\frac{2}{\sigma^3} \sum_{i=1}^n x_i+\frac{2 n \theta}{\sigma^3}+\frac{2}{\sigma^4} \sum_{i=1}^n \frac{e^{-\frac{x_i-\theta}{\sigma}}\left(x_i-\theta\right)\left(2 \sigma\left(1+e^{-\frac{x_i-\theta}{\sigma}}\right)-\left(x_i-\theta\right)\right)}{\left(1+e^{-\frac{x_i-\theta}{\sigma}}\right)^2} \end{aligned} $$

I was able to show that $\lim_{\theta \rightarrow \infty}{l(\theta,\sigma^*)}=\lim_{\theta \rightarrow -\infty}{l(\theta,\sigma^*)}=\lim_{\sigma \rightarrow 0+}{l(\theta^*,\sigma)}=\lim_{\sigma \rightarrow \infty}{l(\theta^*,\sigma)}=-\infty$, but I don't know how to show that the Hessian matrix is negative definite.
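Not a proof, but as a sanity check one can transcribe the second-derivative formulas above into code, cross-check them against finite differences of the log-likelihood, and inspect the eigenvalues of the Hessian numerically. Everything below is an assumption for illustration: the data are simulated, and the evaluation point $(\theta,\sigma)=(1,2)$ is arbitrary.

```python
import numpy as np

def loglik(x, theta, sigma):
    # l(theta, sigma) = -n log(sigma) - sum z_i - 2 sum log(1 + e^{-z_i}),
    # with z_i = (x_i - theta) / sigma.
    z = (x - theta) / sigma
    return -len(x) * np.log(sigma) - np.sum(z) - 2.0 * np.sum(np.log1p(np.exp(-z)))

def hessian(x, theta, sigma):
    # Analytic second derivatives, transcribed from the formulas above.
    n = len(x)
    e = np.exp(-(x - theta) / sigma)
    d = 1.0 + e
    l_tt = -(2.0 / sigma**2) * np.sum(e / d**2)
    l_ts = -n / sigma**2 + (2.0 / sigma**3) * np.sum(
        e * (sigma * d - (x - theta)) / d**2)
    l_ss = (n / sigma**2 - (2.0 / sigma**3) * np.sum(x) + 2.0 * n * theta / sigma**3
            + (2.0 / sigma**4) * np.sum(
                e * (x - theta) * (2.0 * sigma * d - (x - theta)) / d**2))
    return np.array([[l_tt, l_ts], [l_ts, l_ss]])

def fd_hessian(f, p, h=1e-4):
    # Central finite-difference Hessian of f at p, to catch algebra slips.
    H = np.empty((2, 2))
    for i in range(2):
        for j in range(2):
            ei, ej = h * np.eye(2)[i], h * np.eye(2)[j]
            H[i, j] = (f(p + ei + ej) - f(p + ei - ej)
                       - f(p - ei + ej) + f(p - ei - ej)) / (4.0 * h * h)
    return H

rng = np.random.default_rng(0)
x = rng.logistic(loc=1.0, scale=2.0, size=200)  # simulated data, arbitrary truth

H = hessian(x, 1.0, 2.0)
Hfd = fd_hessian(lambda p: loglik(x, p[0], p[1]), np.array([1.0, 2.0]))
print("analytic vs finite differences agree:", np.allclose(H, Hfd, rtol=1e-3, atol=1e-3))
print("eigenvalues at (1.0, 2.0):", np.linalg.eigvalsh(H))
```

Note that $\partial^2 l/\partial\theta^2 < 0$ is immediate from its formula (every summand is strictly positive), so only the determinant condition is at stake; negative eigenvalues at sampled points suggest, but of course do not prove, negative-definiteness.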