MLEs are pretty useful for estimating parameters of probability distributions when they are consistent and asymptotically normal. But I'm not good enough at math and proofs to verify that the relevant regularity conditions hold. For example, one sufficient condition for consistency is that the parameter space be compact, yet the parameter space for the mean and variance of a normal distribution is neither bounded nor closed, and you can still use MLE for the normal distribution. Is there a comprehensive list of all the distributions for which MLEs have been proven to be consistent and asymptotically normal, just for the dummies like me? I can't for the life of me find one.
For which probability distributions are the MLEs known to be consistent and asymptotically normal?
633 Views. Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
There are 2 best solutions below
I think MLEs are always consistent and asymptotically normal in all finite-dimensional exponential families in the iid sampling regime, when the true parameter lies in the interior of the natural parameter space. This covers many textbook examples (including the Gaussian location and scale family) but not, say, the Cauchy location problem, even though consistency and asymptotic normality hold there too.
Often conditions (such as the compactness condition you complain about) are imposed for mathematical or expositional convenience, not because they exhaust the range of what is actually true.
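A minimal sketch of the claim above, using only the standard library (the distribution, names, and tolerances are my own choices, not part of the answer): for iid samples from an Exponential($\lambda$) distribution, a one-parameter exponential family, the MLE is $\hat\lambda = 1/\bar X$, and a simulation shows it converging to the truth with roughly normal fluctuations.

```python
import random
import statistics

# Illustration only: empirically check consistency and asymptotic
# normality of the MLE in a one-parameter exponential family.
# For iid X_i ~ Exponential(rate), the MLE is rate_hat = 1 / sample_mean.
random.seed(0)
rate = 2.0

def mle(n):
    xs = [random.expovariate(rate) for _ in range(n)]
    return 1.0 / statistics.fmean(xs)

# Consistency: rate_hat approaches the true rate as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, mle(n))

# Asymptotic normality: sqrt(n) * (rate_hat - rate) should be roughly
# N(0, rate^2), since the per-sample Fisher information is I(rate) = 1/rate^2.
n, reps = 500, 2000
errs = [n ** 0.5 * (mle(n) - rate) for _ in range(reps)]
print(statistics.stdev(errs))  # should land near rate = 2.0
```

Nothing here is a proof, of course, but this kind of check is a quick sanity test when you cannot verify the regularity conditions by hand.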
Fundamentals of Statistical Signal Processing: Estimation Theory by Steven M. Kay:
Page $167$ Theorem $7.1$: Maximum likelihood estimators satisfying some regularity conditions are asymptotically normally distributed,
$$\hat\theta\sim\mathcal{N}(\theta,I^{-1}(\theta))$$
asymptotically unbiased, and they attain the CRLB (Cramér-Rao Lower Bound).
Hence, MLEs are asymptotically efficient and optimal. Since they are asymptotically unbiased and their variance shrinks to zero (the Fisher information grows with $n$), they are also consistent.
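To make the CRLB claim concrete, here is a small simulation (an illustration of my own, not taken from Kay's text): for Bernoulli($p$), the MLE $\hat p$ is the sample mean, and its variance matches the Cramér-Rao bound $1/(nI(p)) = p(1-p)/n$.

```python
import random
import statistics

# Illustration only: for Bernoulli(p), the MLE p_hat = sample mean is
# unbiased with variance p(1-p)/n, which equals the Cramer-Rao lower
# bound 1/(n * I(p)), since the Fisher information is I(p) = 1/(p(1-p)).
random.seed(1)
p, n, reps = 0.3, 200, 5000

p_hats = [
    statistics.fmean(1 if random.random() < p else 0 for _ in range(n))
    for _ in range(reps)
]

empirical_var = statistics.variance(p_hats)
crlb = p * (1 - p) / n
print(empirical_var, crlb)  # the two should be close
```

For the Bernoulli family the bound is attained exactly at every $n$; in general the theorem only promises it asymptotically.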
Regularity conditions:
$1.$ The derivatives of the log-likelihood function exist
$2.$ Fisher information is non-zero
You can also find the same information in the Wikipedia article on maximum likelihood estimation (see the section "Properties").
The compactness assumption is there to guarantee that the MLE exists: without it, the likelihood may have no maximizer. For example, $\sigma\in\mathbb{R}^+$, and $\mathbb{R}^+$ is not compact with respect to the standard topology. But we know in advance that $\sigma<\infty$, and as $n\to\infty$ the maximizer cannot run off to infinity, so in practice the MLE still exists.