Consistency for maximum likelihood estimator with a single sample


Suppose you have a finite family of probability measures $\{\mu_\theta: \theta \in S\}$ on a finite space $\Omega$ (with respect to the discrete sigma algebra). Let $X$ be a random element of $\Omega$ distributed according to $\mu_{\theta_0}$, for some fixed $\theta_0 \in S$. Consider the maximum likelihood estimator for $\theta_0$:

$\hat{\theta}_{MLE} = \arg\max_\theta L(\theta | X)$.

Here $L$ is the usual likelihood,

$L(\theta | X) = \mu_\theta(X)$.

I'm curious about consistency of the MLE in this simple setup, but not in the usual sense of 'asymptotic consistency': rather, I want to know if there are conditions under which the MLE is consistent for a single sample, i.e.

$\arg \max_\theta \mathbb{P}(\hat{\theta}_{MLE} = \theta) = \theta_0$.

(Sorry if my notation is a bit weird -- let me know if it's confusing and I'll try to clarify.) I am guessing this can fail in general -- is there a simple example? Are there some conditions under which it holds?
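Since everything here is finite, the property in question can be checked by brute force for any given family. The helper below (the function names are mine, not standard) computes the exact distribution of the MLE under $\mu_{\theta_0}$ and compares its argmax to $\theta_0$; likelihood ties are broken by taking the smallest maximizing $\theta$, which matters in degenerate families.

```python
from collections import defaultdict

def mle(mu, x):
    """Index theta maximizing mu[theta][x]; ties broken by smallest theta."""
    return max(range(len(mu)), key=lambda t: (mu[t][x], -t))

def mle_distribution(mu, theta0):
    """Exact distribution of theta_hat when X ~ mu[theta0] on finite Omega."""
    dist = defaultdict(float)
    for x, p in enumerate(mu[theta0]):
        dist[mle(mu, x)] += p
    return dict(dist)

def single_sample_consistent(mu, theta0):
    """True iff theta0 maximizes P(theta_hat = theta)."""
    dist = mle_distribution(mu, theta0)
    return dist.get(theta0, 0.0) == max(dist.values())
```

For instance, for the well-separated two-point family $\mu_0 = (0.9, 0.1)$, $\mu_1 = (0.2, 0.8)$ with $\theta_0 = 0$, the MLE distribution is $\{0: 0.9,\ 1: 0.1\}$ and the property holds.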

By Cauchy-Schwarz, the expected likelihood $\mathbb{E}L(\theta|X)$ satisfies

$\mathbb{E}L(\theta | X) = \mathbb{E}\mu_\theta(X) = \sum_\omega \mu_\theta(\omega) \mu_{\theta_0}(\omega) \leq \sqrt{\sum_\omega \mu_\theta(\omega)^2} \sqrt{\sum_\omega \mu_{\theta_0}(\omega)^2}$

with equality iff $\mu_\theta$ is proportional to $\mu_{\theta_0}$, i.e. (both being probability measures) iff $\mu_\theta = \mu_{\theta_0}$; in particular the bound is attained at $\theta = \theta_0$. Note that the right-hand side still depends on $\theta$, so this shows $\mathbb{E}L(\theta|X)$ is maximized at $\theta_0$ only when $\sum_\omega \mu_\theta(\omega)^2 \leq \sum_\omega \mu_{\theta_0}(\omega)^2$ for all $\theta$. And even then, it doesn't imply that the distribution of the MLE is maximized there.
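As a quick numerical illustration of the non-uniqueness (a toy family of my own choosing): if $\mu_{\theta_0}$ is uniform on $\Omega$, then $\mathbb{E}L(\theta|X) = \sum_\omega \mu_\theta(\omega)/|\Omega| = 1/|\Omega|$ for every $\theta$, so every $\theta$ maximizes the expected likelihood.

```python
# Toy illustration: when the true measure mu_{theta0} is uniform, the
# expected likelihood E[L(theta|X)] = sum_w mu_theta(w) * mu_theta0(w)
# equals 1/|Omega| for every theta, so theta0 is a (highly non-unique)
# maximizer of the expected likelihood.
mu = [(0.5, 0.5), (0.0, 1.0), (1.0, 0.0)]  # theta = 0, 1, 2; Omega = {0, 1}
theta0 = 0

expected_likelihood = [
    sum(mu[t][w] * mu[theta0][w] for w in range(2)) for t in range(3)
]
print(expected_likelihood)  # [0.5, 0.5, 0.5]
```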

Edit 1: To be careful, the desired conclusion should be that $\theta_0$ is a point that maximizes $\mathbb{P}(\hat{\theta}_{MLE} = \theta)$ (not necessarily the only such point).

There is 1 answer below.

Consider $\theta$ taking 3 possible values, $S = \{0,1,2\}$, and $X$ taking 2 possible values, $\Omega = \{0,1\}$. We take: $\mu_0 = (1/2,1/2)$, $\mu_1 = (0,1)$, $\mu_2 = (1,0)$, with true parameter $\theta_0 = 0$.

We get that:

$$\hat\theta_{MLE}(X) = \cases{2 \quad\text{if}\quad X=0 \\ 1 \quad\text{if} \quad X=1\\}$$

since $\mu_2(0) = 1 > \mu_0(0) = 1/2 > \mu_1(0) = 0$, and $\mu_1(1) = 1 > \mu_0(1) = 1/2 > \mu_2(1) = 0$,

and so:

$$\mathbb{P}(\hat\theta_{MLE} = 0) = 0, \qquad \mathbb{P}(\hat\theta_{MLE} = 1) = \mathbb{P}(\hat\theta_{MLE} = 2) = \tfrac{1}{2}.$$

So with $\theta_0 = 0$ as the true parameter, $\arg\max_\theta \mathbb{P}(\hat\theta_{MLE} = \theta) \neq \theta_0$; in fact the MLE never takes the value $\theta_0$ at all. This gives your counterexample.
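The counterexample can be verified exactly in a few lines (a sketch; the likelihood argmax below breaks ties toward the smallest $\theta$, though no ties actually occur in this family):

```python
mu = [(0.5, 0.5), (0.0, 1.0), (1.0, 0.0)]  # mu_0, mu_1, mu_2 on Omega = {0, 1}
theta0 = 0

# The MLE as a function of the observation: theta_hat(x) = argmax_t mu_t(x).
theta_hat = {x: max(range(3), key=lambda t: mu[t][x]) for x in (0, 1)}
# theta_hat == {0: 2, 1: 1}: the MLE never takes the value 0.

# Exact distribution of the MLE when X ~ mu_{theta0}:
p_hat = {t: sum((p for x, p in enumerate(mu[theta0]) if theta_hat[x] == t), 0.0)
         for t in range(3)}
print(p_hat)  # {0: 0.0, 1: 0.5, 2: 0.5}
```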