Minimum variance unbiased estimator of exponential distribution


The given model is $\text{Exp}(\mu,\sigma),\;\mu\in\Bbb{R},\sigma\gt0$ whose pdf is

$f(x;\theta)={1\over \sigma}e^{-{(x-\mu)\over \sigma}}I_{(\mu,\infty)}(x)$
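For concreteness, here is a small simulation sketch (assuming NumPy; the parameter values are arbitrary) of this model: if $E\sim\text{Exp}(1)$, then $X=\mu+\sigma E$ has the density above, with $E[X]=\mu+\sigma$ and $\operatorname{Var}(X)=\sigma^2$.

```python
import numpy as np

# Sampling from the shifted (two-parameter) exponential Exp(mu, sigma):
# if E ~ Exp(1), then X = mu + sigma * E has the pdf above.
rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 1.5, 200_000

x = mu + sigma * rng.exponential(1.0, size=n)

# The density implies E[X] = mu + sigma and Var(X) = sigma**2,
# and the support starts at mu.
print(x.mean())   # close to mu + sigma = 3.5
print(x.var())    # close to sigma**2 = 2.25
print(x.min())    # slightly above mu = 2.0
```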

I easily found that $(X_{(1)},\bar{X}-X_{(1)})'$ is a complete sufficient statistic (CSS) for $\theta=(\mu,\sigma)'$ with sample size $n$.

The problem is that the parameter to be estimated is $\eta=P_{\theta}(X_{1}\gt a)$ for a given $a\in\Bbb{R}$, not $\theta$ itself.

I'm trying to solve it by using a Beta distribution as an ancillary statistic and applying the Lehmann–Scheffé theorem, but it doesn't work well.

$1)\;\;$I think ${X_{1}-X_{(1)}\over \bar{X}-X_{(1)}}\sim B(1,n-2)$ is an ancillary statistic for $\theta$. Is this right?

$2)\;\;$If my guess is wrong (or it is too difficult to work with an ancillary statistic), what is the key to this problem?
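As a numerical sanity check on 1), here is a Monte Carlo sketch (assuming NumPy; sample size and parameter values are arbitrary). The ratio $R=(X_1-X_{(1)})/(\bar{X}-X_{(1)})$ is invariant under location-scale changes, and $\text{Exp}(\mu,\sigma)$ is a location-scale family, so $R$ is certainly ancillary; note, however, that $R$ takes values in $[0,n]$ and has an atom at $0$ (with probability $1/n$ we have $X_1=X_{(1)}$), so an exact Beta form would require rescaling and conditioning.

```python
import numpy as np

# Checking numerically that R = (X_1 - X_(1)) / (Xbar - X_(1)) is ancillary:
# its distribution should not depend on (mu, sigma), since R is invariant
# under location-scale changes and Exp(mu, sigma) is a location-scale family.
rng = np.random.default_rng(1)

def simulate_R(mu, sigma, n=10, reps=100_000):
    x = mu + sigma * rng.exponential(1.0, size=(reps, n))
    xmin = x.min(axis=1)
    return (x[:, 0] - xmin) / (x.mean(axis=1) - xmin)

r1 = simulate_R(mu=0.0, sigma=1.0)
r2 = simulate_R(mu=5.0, sigma=3.0)

# Empirical quantiles agree across parameter values (ancillarity).
print(np.quantile(r1, [0.25, 0.5, 0.75]))
print(np.quantile(r2, [0.25, 0.5, 0.75]))
# R has an atom at 0: P(X_1 = X_(1)) = 1/n by symmetry.
print((r1 == 0).mean())  # close to 1/n = 0.1
```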

Accepted answer:

I will use the more common notation, i.e., $1/\sigma = \lambda$ and $\mu = \gamma$, hence $$ \mathbb{P}(X>a)= \exp\{-\lambda(a-\gamma)\}. $$ Since the MLEs are $\hat{\gamma}=X_{(1)}$ and $\hat{\lambda}=1/(\bar{X}_n-X_{(1)})$, the MLE of the tail probability is $$ \hat{P}=\exp\Big\{-\frac{a-X_{(1)}}{\bar{X}_n-X_{(1)}}\Big\}. $$ This is a biased estimator, so you can find its expectation using the joint distribution of $\bar{X}_n$ and $X_{(1)}$ and then correct the bias (this is basically an application of the Lehmann–Scheffé theorem). I'm not sure that this is an easy exercise.

However, finding UMVU estimators is an old-fashioned problem in parametric statistics. In https://projecteuclid.org/download/pdf_1/euclid.aoms/1177706256 you can find, in eq. (7.9), a UMVUE of the tail probability for $\lambda = 1$, or use Theorem 3 to construct a UMVUE for any functional of a shifted exponential distribution (for the exponential distribution, truncation is equivalent to shifting, therefore you can apply all the results from this paper).
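To illustrate the bias numerically, here is a Monte Carlo sketch (assuming NumPy; the parameter values $\mu=1$, $\sigma=2$, $a=5$, $n=5$ are arbitrary). The plug-in components are themselves biased, e.g. $E[X_{(1)}]=\mu+\sigma/n$ exactly, so the plug-in $\hat{P}$ inherits a small-sample bias.

```python
import numpy as np

# Monte Carlo illustration of the MLE
#   P_hat = exp{-(a - X_(1)) / (Xbar_n - X_(1))}
# of P(X > a) for the shifted exponential; biased in small samples.
rng = np.random.default_rng(2)
mu, sigma, a, n, reps = 1.0, 2.0, 5.0, 5, 200_000

true_p = np.exp(-(a - mu) / sigma)            # exp(-2) ~ 0.1353

x = mu + sigma * rng.exponential(1.0, size=(reps, n))
xmin = x.min(axis=1)                          # MLE of mu (= gamma)
scale = x.mean(axis=1) - xmin                 # MLE of sigma (= 1/lambda)
p_hat = np.exp(-(a - xmin) / scale)

# X_(1) is itself biased: E[X_(1)] = mu + sigma/n exactly.
print(xmin.mean(), mu + sigma / n)
# Compare the Monte Carlo average of P_hat with the true tail probability.
print(p_hat.mean(), true_p)
```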