Is $\bar X$ a minimum variance unbiased estimator of $\theta$ in an exponential distribution?


I proceeded by finding $\operatorname E(\bar X)$. I treated $\bar X$ as a constant and simply got the term itself back, which would suggest that $\bar X$ is not an unbiased estimator of $\theta$; that would indeed be obvious if $\bar X$ really were a constant, as I assumed. However, I also know that $\mu = \theta$, so my reasoning is probably wrong. I first have to show that $\bar X$ is an unbiased estimator of $\mu$.


You have $$ \bar X = \frac{ X_1+\cdots+X_n } n $$ and for every measurable $A\subseteq[0,\infty)^n,$ $$ \Pr((X_1,\ldots,X_n) \in A) = \int_A e^{-x_1/\theta} \cdots e^{-x_n/\theta} \, \frac{d(x_1,\ldots,x_n)}{\theta^n}. $$ The density is $\dfrac{e^{-(x_1\,+\,\cdots\,+\,x_n)/\theta}}{\theta^n}.$ The fact that the density depends on $(x_1,\ldots,x_n)$ only through $x_1+\cdots+x_n$ is sufficient (but not necessary) to show that $X_1+\cdots+X_n$ is a sufficient statistic for $\theta,$ i.e. the conditional distribution of $(X_1,\ldots,X_n)$ given $X_1+\cdots+X_n$ does not depend on $\theta.$

That $X_1+\cdots+X_n$ is a complete statistic means that the only measurable function $g$ satisfying $$ \int_{[0,\infty)^n} g(x_1+\cdots+x_n)\, e^{-(x_1+\cdots+x_n)/\theta} \,\frac{d(x_1,\ldots,x_n)}{\theta^n} = 0 \quad \text{for every } \theta>0 $$ is the function that is zero almost everywhere.


Assuming the underlying population is exponential with mean $\theta$, i.e. with density $$f(x;\theta)=\frac{1}{\theta}e^{-x/\theta}\,\mathbb I(x),\quad\theta>0,$$

where $\mathbb I(x)=\begin{cases}1,&\text{if }x>0\\0,&\text{otherwise}\end{cases}$.

Suppose $(X_1,X_2,\ldots,X_n)$ is a random sample drawn from this population.

Then, $\displaystyle\mathbb E_{\theta}(\bar X)=\mathbb E_{\theta}\left(\frac{1}{n}\sum_{i=1}^nX_i\right)=\frac{1}{n}\sum_{i=1}^n\mathbb E_{\theta}(X_i)=\frac{n\theta}{n}=\theta$ for all $\theta$.

So as usual we see that the sample mean $\bar X$ is unbiased for the population mean $\theta$.
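As a quick numerical sanity check of the unbiasedness claim, here is a small Monte Carlo sketch; the values $\theta=2$ and $n=10$ and the use of NumPy are my own illustrative choices, not part of the answer:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000  # arbitrary illustrative parameters

# Draw `reps` independent samples of size n from Exp(mean = theta)
# and compute the sample mean of each one.
samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)

# If E(X-bar) = theta, the average of the simulated sample means
# should be very close to theta.
print(xbar.mean())
```

Averaging two hundred thousand simulated sample means makes the Monte Carlo error tiny compared with $\theta$, so a result near $2.0$ is consistent with $\mathbb E_\theta(\bar X)=\theta$.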

Now, the joint density of $(X_1,X_2,\cdots,X_n)$ is \begin{align}f_{\theta}(\mathbf x)&=\prod_{i=1}^nf(x_i;\theta)\\&=\frac{1}{\theta^n}\exp\left(-\frac{1}{\theta}\sum_{i=1}^nx_i\right)\prod_{i=1}^n\mathbb I(x_i)\\\implies \ln f_{\theta}(\mathbf x)&=-n\ln \theta-\frac{1}{\theta}\sum_{i=1}^nx_i+\sum_{i=1}^n\ln \mathbb I(x_i)\\\implies\frac{\partial}{\partial\theta}\ln f_{\theta}(\mathbf x)&=\frac{-n}{\theta}+\frac{n\bar x}{\theta^2}\\&=\frac{n}{\theta^2}\left(\bar x-\theta\right)\end{align}

Thus we have expressed the score function in the form

$$\frac{\partial}{\partial\theta}\ln f_{\theta}(\mathbf x)=k(\theta)(T(\mathbf x)-\theta)\tag{1}$$

which is the equality condition in the Cramér-Rao inequality.
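The algebra leading to the factored score can be checked symbolically. A minimal SymPy sketch, writing the log-likelihood directly in terms of $\bar x$ (my own restatement of the derivation above, not code from the answer):

```python
import sympy as sp

theta, n, xbar = sp.symbols('theta n xbar', positive=True)

# Log-likelihood of the sample, using sum(x_i) = n * xbar
# (the indicator term vanishes on the support x_i > 0).
loglik = -n*sp.log(theta) - n*xbar/theta

# Score function: should factor as (n/theta^2) * (xbar - theta),
# i.e. k(theta) * (T(x) - theta) with k(theta) = n/theta^2.
score = sp.diff(loglik, theta)
assert sp.simplify(score - n*(xbar - theta)/theta**2) == 0
print(sp.factor(score))
```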

Hence we see that

  • $\bar X$ is an unbiased estimator of $\theta$.
  • $\bar X$ is the statistic $T(\mathbf X)$ which satisfies the equality condition $(1)$ of the Cramér-Rao inequality. That is, the variance of $\bar X$ attains the Cramér-Rao lower bound for $\theta$.

These two facts imply that $\bar X$ is the UMVUE of $\theta$.
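One way to see the second bullet concretely: since $\operatorname{Var}_\theta(X_i)=\theta^2$, the Fisher information per observation is $I_1(\theta)=1/\theta^2$, so the Cramér-Rao bound is $\theta^2/n$, which equals $\operatorname{Var}_\theta(\bar X)$ exactly. A simulation sketch of this match (again with my own arbitrary choices $\theta=2$, $n=10$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000  # arbitrary illustrative parameters

samples = rng.exponential(scale=theta, size=(reps, n))
xbar = samples.mean(axis=1)

# Cramér-Rao lower bound: 1 / (n * I_1(theta)) with I_1(theta) = 1/theta^2.
crlb = theta**2 / n

# The empirical variance of X-bar should match the bound (here 0.4).
print(xbar.var(), crlb)
```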


Here we have exploited a corollary of the Cramér-Rao inequality, which says that for a family of distributions parametrised by $\theta$ (assuming the regularity conditions of the Cramér-Rao inequality hold), if a statistic $T$ is unbiased for $\theta$ and satisfies $(1)$, then $T$ must be the uniformly minimum variance unbiased estimator of $\theta$. The same corollary applies to estimating a function of $\theta$. Needless to say, this does not work in every problem; in such cases, one has to use the theory of completeness and sufficiency, as mentioned in the other answer.

If you want to do this problem the usual way, you would have to prove that $\sum_{i=1}^n X_i$ is a complete sufficient statistic for the family of distributions, and that $\bar X$, a function of the complete sufficient statistic, is an unbiased estimator of $\theta$. So by the Lehmann-Scheffé theorem, $\bar X$ is the UMVUE of $\theta$.