An exercise about estimators


I found this exercise in a statistics book. It seems easy, but I am having trouble solving it.

We are asked to find the uniformly minimum-variance unbiased estimator (UMVUE) of $\theta$ based on the maximum likelihood estimator, and to check whether it is efficient, given a simple random sample of size $n$ from a population with density:

$f_\theta(x) = (\theta + 2)\,x^{\theta+1}, \quad x \in (0, 1), \quad \theta > -2$

I have computed the expectation of the maximum likelihood estimator by finding the distribution of the sum of the logarithms, but I do not know how to continue.


Brief explanation of the solution:

  • The given density is a known one: $X\sim \mathrm{Beta}(\theta+2,\,1)$. It is easy to check that this family of distributions belongs to the exponential family, so its canonical statistic is a complete and (minimal) sufficient statistic.

The canonical statistic for this family is $S=\sum_i \log X_i$.

  • With the usual method, deriving the MLE of $\theta$ gives

$$\hat{\theta}=\frac{n}{-\sum_i \log X_i}-2$$
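For completeness (this step is only summarized above), the "usual method" amounts to maximizing the log-likelihood and solving the first-order condition:

```latex
% Log-likelihood of the sample x_1, ..., x_n:
\ell(\theta) = n\log(\theta+2) + (\theta+1)\sum_{i=1}^{n} \log x_i,
\qquad
\ell'(\theta) = \frac{n}{\theta+2} + \sum_{i=1}^{n} \log x_i = 0
\;\Longrightarrow\;
\hat{\theta} = \frac{n}{-\sum_i \log x_i} - 2 .
```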

  • Now we need to check whether the MLE is unbiased; if it is biased, we must correct it to eliminate the bias.

To do that, first observe that $Y=-\log X \sim \mathrm{Exp}(\theta+2)$. This is easy to verify with the fundamental transformation theorem: since $F_X(x)=x^{\theta+2}$ on $(0,1)$, we have $P(Y\le y)=P(X\ge e^{-y})=1-e^{-(\theta+2)y}$.

Now, since the exponential is a particular Gamma distribution, $-\sum_i \log X_i \sim \mathrm{Gamma}(n,\,\theta+2)$, and therefore

$$\frac{1}{-\sum_i \log X_i}\sim \mathrm{InvGamma}(n,\,\theta+2)$$

and so

$$\mathbb{E}\!\left[\frac{1}{-\sum_i \log X_i}\right]=\frac{\theta+2}{n-1}$$

Concluding,

$$\mathbb{E}(\hat{\theta})=\frac{n}{n-1}(\theta+2)-2$$
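This bias formula is easy to check numerically. Here is a minimal Monte Carlo sketch in Python (my own sanity check, not part of the original answer; the parameter values are arbitrary), using inverse-CDF sampling since $F_\theta(x)=x^{\theta+2}$ on $(0,1)$:

```python
import math
import random

random.seed(42)

theta = 1.5      # arbitrary true parameter (any theta > -2 works)
n = 20           # sample size
reps = 100_000   # Monte Carlo replications

def sample_x(n, theta):
    # Inverse-CDF sampling: F(x) = x^(theta+2) on (0,1), so X = U^(1/(theta+2))
    return [random.random() ** (1.0 / (theta + 2)) for _ in range(n)]

mle_mean = 0.0
for _ in range(reps):
    w = -sum(math.log(x) for x in sample_x(n, theta))  # W = -sum log X_i ~ Gamma(n, theta+2)
    mle_mean += (n / w - 2) / reps                     # MLE of theta

expected = n / (n - 1) * (theta + 2) - 2               # E[theta_hat] from the formula above
print(f"simulated E[theta_hat]: {mle_mean:.3f}, formula: {expected:.3f}")
```

The simulated mean should match the formula (here about $1.684$ rather than the true $\theta=1.5$), confirming that the MLE is biased.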

which implies that the estimator

$$T=\frac{n-1}{-\sum_i \log X_i}-2,$$

based on the MLE as requested by the exercise, is

  • an unbiased estimator of $\theta$,
  • a function of $S$, a complete and sufficient statistic.

Hence $T$ is the UMVUE by the Lehmann–Scheffé theorem.
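The exercise also asks whether the estimator is efficient, which the answer above does not address. A sketch of that check (my computation, under the same setup, valid for $n>2$): compare $\mathrm{Var}(T)$ with the Cramér–Rao lower bound.

```latex
% Fisher information of one observation, from
% log f_theta(x) = log(theta+2) + (theta+1) log x :
I(\theta) = -\mathbb{E}\!\left[\frac{\partial^2}{\partial\theta^2}\log f_\theta(X)\right]
          = \frac{1}{(\theta+2)^2},
\qquad
\text{CRLB} = \frac{1}{n\,I(\theta)} = \frac{(\theta+2)^2}{n}.

% Variance of T, using W = -sum_i log X_i ~ Gamma(n, theta+2),
% E[1/W] = (theta+2)/(n-1), E[1/W^2] = (theta+2)^2/((n-1)(n-2)) :
\mathrm{Var}(T) = (n-1)^2\,\mathrm{Var}\!\left(\frac{1}{W}\right)
               = \frac{(\theta+2)^2}{n-2}
               > \frac{(\theta+2)^2}{n}.
```

So $T$ does not attain the Cramér–Rao bound and is therefore not efficient, although the ratio of its variance to the bound tends to $1$ as $n\to\infty$.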