Take the density of a generalized Student-t, i.e., \begin{align*} p( y_t \mid \sigma , \mu, \nu ) = \frac {\Gamma (\frac {\nu +1}{2})}{\Gamma (\frac {\nu }{2}){\sqrt {\pi \nu }}\,{\sigma }}\left(1+{\frac {1}{\nu }}\left({\frac {y_t- \mu }{\sigma }}\right)^{2}\right)^{-\frac {\nu +1}{2}} \end{align*} and suppose the parameter we want a maximum likelihood estimator of is the scale $\sigma > 0$.
Then \begin{align*} \frac{\partial}{\partial \sigma} \log p( y_t \mid \sigma , \mu, \nu ) = - \frac{1}{\sigma} + \frac{(\nu +1)\,(y_t - \mu)^2}{\nu \sigma^3 \left( 1 + \frac{1}{\nu} \left( \frac{y_t- \mu}{\sigma } \right)^2 \right)} \end{align*}
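As a sanity check on this derivative, here is a small Python sketch comparing the analytic score with a central finite difference of the log-density (the function names `log_t` and `score_sigma` are just my own labels for this illustration):

```python
from math import lgamma, log, pi

def log_t(y, sigma, mu=0.0, nu=5.0):
    # log-density of the generalized Student-t above
    return (lgamma((nu + 1) / 2) - lgamma(nu / 2) - 0.5 * log(pi * nu)
            - log(sigma) - (nu + 1) / 2 * log(1 + ((y - mu) / sigma) ** 2 / nu))

def score_sigma(y, sigma, mu=0.0, nu=5.0):
    # analytic d/dsigma of log_t; the fraction above with the
    # denominator multiplied through: nu*sigma^3*(1 + ...) = sigma*(nu*sigma^2 + (y-mu)^2)
    return -1 / sigma + (nu + 1) * (y - mu) ** 2 / (sigma * (nu * sigma ** 2 + (y - mu) ** 2))

# compare at an arbitrary point
y, sigma = 1.3, 0.7
h = 1e-6
fd = (log_t(y, sigma + h) - log_t(y, sigma - h)) / (2 * h)
```

The two values agree to roughly finite-difference precision, which at least confirms the algebra.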
and, once we sum this score over a sample $y_1, \dots, y_T$, the critical-point equation cannot be solved in closed form (for a single observation it can: $\hat\sigma = |y_t - \mu|$). In practice the EM algorithm is usually used to obtain the estimate.
Anyhow, my question is: how do we know, in this case, that the maximum likelihood estimator exists and is unique? Is there a simple way to show this?