Does this estimator respect the likelihood principle?


Exercise: Let $X_1,\ldots,X_n$ be a random sample from the distribution with density $$f(x\mid\theta) = \dfrac{2x}{\theta^2}\mathbb{1}_{(0,\theta)}(x)$$ w.r.t. the Lebesgue measure. Derive an unbiased estimator for $\theta$. Does this estimator respect the likelihood principle?

I know that:

Def. (Likelihood principle (LP)). The information brought by an observation $x$ about $\theta$ is entirely contained in the likelihood function $L(\theta;x)$. Moreover, if $x$ and $x'$ are two observations depending on the same parameter (possibly from different experiments), such that there exists a constant $c$ satisfying $L(\theta;x) = cL(\theta;x')$ for every $\theta$, then they bring the same information about $\theta$ and must lead to identical inferences.

Question: Suppose I derived the unbiased estimator $\hat{\theta} = \dfrac{3}{2}X_1$; does this estimator satisfy the likelihood principle? I know that p-values do not respect the LP, because you reach different conclusions when using different p-values, but I'm not sure how an estimator would or would not respect the LP.
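For completeness, the unbiasedness of $\hat{\theta} = \frac{3}{2}X_1$ can be checked directly from the density: $$\mathbb{E}[X_1] = \int_0^\theta x\cdot\frac{2x}{\theta^2}\,dx = \frac{2}{\theta^2}\cdot\frac{\theta^3}{3} = \frac{2\theta}{3}, \qquad\text{so}\qquad \mathbb{E}\!\left[\tfrac{3}{2}X_1\right] = \theta.$$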

Thanks in advance!

1 Answer
Based on a single observation $X = x$, the likelihood is $$ L(\theta \mid X=x) = \begin{cases} \dfrac{2x}{\theta^2} & \text{for } \theta\in [x,+\infty), \\[8pt] \,\,0 & \text{for } \theta\in(0,x). \end{cases} $$ The MLE is therefore $x$ itself, since $L(\theta)$ increases as $\theta$ decreases, until $\theta$ gets as small as $x$. That is clearly a biased estimator.

Your unbiased estimator $\hat{\theta} = \frac{3}{2}X_1$ is a one-to-one function of the MLE (each is a constant multiple of the other), so either can be computed from the other. Therefore, no information beyond what is contained in the likelihood function goes into your estimator.
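A quick Monte Carlo sketch of both claims (parameter values and variable names here are illustrative, not part of the exercise): $\frac{3}{2}X_1$ is unbiased for $\theta$, while the single-observation MLE $X_1$ is not. Since $F(x) = x^2/\theta^2$ on $(0,\theta)$, inverse-transform sampling gives $X = \theta\sqrt{U}$ with $U \sim \mathrm{Unif}(0,1)$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0        # true parameter (arbitrary choice for the simulation)
reps = 200_000     # number of Monte Carlo replications

# Inverse-transform sampling: F(x) = x^2 / theta^2  =>  X = theta * sqrt(U)
x1 = theta * np.sqrt(rng.uniform(size=reps))

est_unbiased = 1.5 * x1   # hat(theta) = (3/2) X_1

print(np.mean(est_unbiased))  # close to theta = 2.0
print(np.mean(x1))            # close to 2*theta/3, so the MLE X_1 is biased low
```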

Merely being unbiased is not enough to guarantee violation of the likelihood principle. But an insistence on always being unbiased will result in some estimators violating the likelihood principle.