UMVUE for pdf $f_{\theta}(x) = \theta e^{-\theta x}, x>0$

Let $X_1,\ldots,X_n$ be a random sample from a pdf $f_{\theta}(x) = \begin{cases} \theta e^{-\theta x}, & x>0 \\ 0, & \text{otherwise} \end{cases}$, where $\theta>0$ is an unknown parameter.

Then, the uniform minimum variance unbiased estimator for $\dfrac{1}{\theta}$ is

(A) $\dfrac{1}{\bar{X_n}}$

(B) $\displaystyle\sum_{i=1}^{n}X_i$

(C) $\bar{X_n}$

(D) $\dfrac{1}{\displaystyle\sum_{i=1}^{n}X_i}$

MY STEPS:

Taking the Expectation, $E_\theta(X)=\displaystyle\int_{0}^{\infty}xf(x)\;dx$

$$E_\theta(X)=\theta\int_0^\infty x e^{-\theta x}\;dx=\dfrac{1}{\theta}$$

By linearity of expectation, $E_\theta(\bar{X_n})=\dfrac{1}{\theta}$ as well, so $\bar{X_n}$ is unbiased for $\dfrac{1}{\theta}$.

Hence, option (C) should be correct.
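As a quick numerical sanity check (not a proof), here is a Monte Carlo sketch in Python; the values $\theta = 2$ and $n = 20$ are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0            # true rate parameter (arbitrary choice)
n, reps = 20, 200_000  # sample size and number of replications

# Draw `reps` samples of size n from Exp(rate = theta); note that
# NumPy's exponential() takes the mean scale 1/theta, not the rate.
samples = rng.exponential(scale=1.0 / theta, size=(reps, n))
xbar = samples.mean(axis=1)

# The average of the sample means should be close to 1/theta = 0.5.
print(np.mean(xbar))
```

The empirical average of $\bar{X_n}$ settles near $1/\theta$, consistent with unbiasedness.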

Did I solve this correctly? Please help me confirm my solution.

On BEST ANSWER

Yes, correct procedure and answer.

Some background: This is the exponential distribution and it is 'parameterized' by the rate (here $\theta$, perhaps most often $\lambda$). It can also be parameterized by its mean as $f_\mu(x) = (1/\mu) e^{-x/\mu}$, for $x > 0$.

In parameter estimation, that raises the question of which estimators are unbiased for which parameters. You have just shown that $\bar X$ is unbiased for the parameter $\mu = 1/\theta$. It is also the UMVUE for $\mu$ because it is not only unbiased, but also a function of the complete sufficient statistic $\sum_{i=1}^n X_i$ (Lehmann–Scheffé theorem).

However, $1/\bar X$ is not the UMVUE for $\theta$. It is a function of the sufficient statistic, but it is biased: since $\sum_{i=1}^n X_i \sim \text{Gamma}(n,\theta)$, one finds $E(1/\bar X) = \frac{n}{n-1}\,\theta \ne \theta$ for $n > 1$. Although expectation is a linear operator, unbiasedness does not survive nonlinear transformations such as taking the reciprocal. (The bias-corrected estimator $\frac{n-1}{\sum_i X_i}$ is unbiased and a function of the complete sufficient statistic, hence the UMVUE for $\theta$.)
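The bias of $1/\bar X$ can also be seen numerically. A Monte Carlo sketch (again with the arbitrary choices $\theta = 2$, $n = 20$, so that $\frac{n}{n-1}\theta = \frac{40}{19} \approx 2.105$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0            # true rate parameter (arbitrary choice)
n, reps = 20, 200_000  # sample size and number of replications

# NumPy's exponential() is parameterized by the scale 1/theta.
samples = rng.exponential(scale=1.0 / theta, size=(reps, n))
inv_xbar = 1.0 / samples.mean(axis=1)

# Since sum(X_i) ~ Gamma(n, theta), E(1/Xbar) = n*theta/(n-1) > theta:
# the reciprocal of an unbiased estimator is biased upward here.
print(np.mean(inv_xbar))      # empirical mean of 1/Xbar
print(n * theta / (n - 1))    # theoretical value, approx. 2.105
```

The empirical mean of $1/\bar X$ sits noticeably above $\theta = 2$, matching the exact value $\frac{n}{n-1}\theta$.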

See comments by @Michael Hardy and @heropup for important clarifications.