General strategy to prove (or disprove) the existence of a Maximum Likelihood Estimator

Most of the questions and topics I found about the MLE on this site focus on concrete examples, where the standard strategy of maximizing by differentiating was usually the way to go.

My situation is the following: assume we have a sample $X_1, \ldots, X_n$ generated by distributions that depend on two parameters $\theta = (\theta_1, \theta_2) \in {\mathbb{R}^+}^2$, so that our parameter space is open and not compact. Furthermore, assume that the log-likelihood function is differentiable in the parameters, but not in a nice and clean way (so we cannot solve explicitly for $\theta_1, \theta_2$).

What is your strategy (theorems at hand, etc.) for proving the existence or non-existence of the MLE in these "rougher" situations?

Concrete example: let the $X_i$ be independent and identically distributed with CDF $$F_\theta(x) := \begin{cases} 0, & \text{for } x \leq 0 \\ 1- \exp(- \frac{\theta_1}{\theta_2}x^{\theta_2}), & \text{for } 0 < x < 1 \\ 1- \exp(- \frac{\theta_1}{\theta_2} \cdot (x^{\theta_2} - (x-1)^{\theta_2})), & \text{for } 1 \leq x \end{cases}, $$ where $\theta \in {\mathbb{R}^+}^2.$ How would one start proving the existence (or non-existence) of the MLE?
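To get a feel for where the supremum of the likelihood lives, one can at least write the log-likelihood down numerically. A sketch (the density below is hand-derived by differentiating the CDF pieces, so it should be double-checked; the function name `log_likelihood` is just for illustration):

```python
import numpy as np

def log_likelihood(theta, x):
    """Log-likelihood for the CDF in the question.

    The density comes from differentiating F_theta piecewise (hand-derived,
    so verify it): note it is only a valid density for theta_2 > 1, since
    for theta_2 < 1 the factor x**(t2-1) - (x-1)**(t2-1) is negative on x > 1.
    """
    t1, t2 = theta
    x = np.asarray(x, dtype=float)
    ll = np.empty_like(x)
    lo = x < 1.0
    # f(x) = t1 * x**(t2-1) * exp(-(t1/t2) * x**t2)             on (0, 1)
    ll[lo] = np.log(t1) + (t2 - 1.0) * np.log(x[lo]) - (t1 / t2) * x[lo] ** t2
    hi = ~lo
    # f(x) = t1 * (x**(t2-1) - (x-1)**(t2-1))
    #        * exp(-(t1/t2) * (x**t2 - (x-1)**t2))              on [1, inf)
    ll[hi] = (np.log(t1)
              + np.log(x[hi] ** (t2 - 1.0) - (x[hi] - 1.0) ** (t2 - 1.0))
              - (t1 / t2) * (x[hi] ** t2 - (x[hi] - 1.0) ** t2))
    return float(ll.sum())
```

With this in hand one can tabulate the log-likelihood along rays toward the boundary of the parameter space (e.g. $\theta_1 \to 0$, $\theta_2 \to \infty$) to check whether the supremum could escape to the boundary, which is the typical failure mode for existence on an open, non-compact parameter space.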

Kind regards, fixfoxi

There is 1 solution below.


If the likelihood isn't differentiable, use momentum gradient descent (with subgradients) or another numerical method for finding local maxima.
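A minimal sketch of that idea, using the Laplace location model (whose log-likelihood has a kink at every data point) as a stand-in for a non-differentiable likelihood; the step size and momentum coefficient here are arbitrary choices, not tuned values:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.laplace(loc=3.0, size=201)  # synthetic sample with true location 3

def log_lik(mu):
    """Laplace location log-likelihood: not differentiable at each x_i."""
    return -np.sum(np.abs(x - mu))

# Momentum gradient ascent on a subgradient of the log-likelihood
mu, v = 0.0, 0.0
for _ in range(2000):
    grad = np.sum(np.sign(x - mu))  # subgradient of log_lik at mu
    v = 0.9 * v + 0.001 * grad      # momentum update
    mu += v

print(mu)  # settles near the sample median, the known MLE for this model
```

Note that such a method can only locate a candidate maximizer; it does not by itself prove that a global maximum exists on an open parameter space.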

If the likelihood is differentiable, use a numerical root-finding method such as Newton's method on the score equations, and check that the Hessian at the solution is negative definite.
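As a sketch of that recipe (not the model from the question: a plain normal sample stands in here for any smooth two-parameter likelihood, and the derivatives are taken by finite differences):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=500)  # synthetic sample

def neg_log_lik(theta):
    """Negative log-likelihood of an i.i.d. N(mu, sigma^2) sample,
    a stand-in for any smooth two-parameter model."""
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    return 0.5 * np.sum(((x - mu) / sigma) ** 2) + x.size * np.log(sigma)

def gradient(f, t, h=1e-5):
    """Central-difference gradient of f at t."""
    g = np.zeros_like(t)
    for i in range(t.size):
        e = np.zeros_like(t); e[i] = h
        g[i] = (f(t + e) - f(t - e)) / (2.0 * h)
    return g

def hessian(f, t, h=1e-4):
    """Central-difference Hessian of f at t."""
    n = t.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(t + ei + ej) - f(t + ei - ej)
                       - f(t - ei + ej) + f(t - ei - ej)) / (4.0 * h * h)
    return H

# Newton's method on the score equations (zeros of the gradient)
theta = np.array([1.0, 1.0])
for _ in range(100):
    step = np.linalg.solve(hessian(neg_log_lik, theta),
                           gradient(neg_log_lik, theta))
    theta = theta - step
    if np.linalg.norm(step) < 1e-10:
        break

# Positive definite Hessian of the *negative* log-likelihood is the same as
# negative definite Hessian of the log-likelihood: a local maximum.
is_local_max = bool(np.all(np.linalg.eigvalsh(hessian(neg_log_lik, theta)) > 0))
print(theta, is_local_max)
```

A negative-definite Hessian only certifies a local maximum; to conclude it is the MLE one still has to rule out the likelihood being larger elsewhere, in particular along the boundary of the parameter space.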