Unbiased Estimator for $\mu$ and $\sigma$


Suppose that $X_{1}, \ldots ,X_{n}$ are iid random variables whose probability distribution belongs to the family with density $f(x;\mu, \sigma)=\frac{1}{\sigma}e^{-\frac{x-\mu}{\sigma}}$ for $x>\mu$, where $0<\mu<\infty$ and $\sigma>0$. Prove, using the factorization theorem, that the joint statistic $(X_{(1)}, \sum X_{i})$ is sufficient for the parameter $\theta=(\mu,\sigma)$, where $X_{(1)}=\min(X_{1},\ldots ,X_{n})$.

I already found the maximum likelihood estimators of both $\mu$ and $\sigma$:

$\widehat{\sigma}=\overline{X}-X_{(1)}$

$\widehat{\mu}=X_{(1)}$

Now I have to find the minimum variance unbiased estimator for $\sigma$. My idea is to compute the expected value of each of the estimators of $\mu$ and $\sigma$.

Could someone please prove that they are unbiased?

Thanks for your help


There are 2 best solutions below


The question asks to show $\boldsymbol T : \mathbb R^n \to \mathbb R^2$ defined by $\boldsymbol T(\boldsymbol X) = (X_{(1)}, \sum X_i)$ is a sufficient statistic. It isn't asking for an unbiased estimator.

To show sufficiency, all we need to do is compute the joint density and use the Factorization theorem, as the question states:

$$\begin{align*} f(\boldsymbol x \mid \mu, \sigma) &= \prod_{i=1}^n \frac{1}{\sigma} e^{-(x_i -\mu)/\sigma} \mathbb 1 (x_i > \mu) \\ &= \sigma^{-n} \exp \left( - \frac{1}{\sigma} \sum_{i=1}^n (x_i - \mu) \right) \mathbb 1 (x_{(1)} > \mu) \\ &= \sigma^{-n} e^{n \mu/\sigma} \exp \left( -\frac{\sum x_i}{\sigma} \right) \mathbb 1 (x_{(1)} > \mu) \\ &= h(\boldsymbol x) g(\mu, \sigma \mid \boldsymbol T(\boldsymbol x)), \end{align*}$$ where $h(\boldsymbol x) = 1$ and $g$ is the above function with the parameters $\mu, \sigma$ depending on the sample $\boldsymbol x$ through the sufficient statistic $\boldsymbol T(\boldsymbol x) = (x_{(1)}, \sum x_i)$. However, that is not to say that such a statistic estimates the parameters. It only shows that $\boldsymbol T$ does not discard any information about the parameters that is contained in the original sample.
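The factorization can also be checked numerically: any two samples that share the same value of $(x_{(1)}, \sum x_i)$ must yield identical likelihoods for every $(\mu, \sigma)$. A minimal Python sketch (function and variable names are illustrative, not from the original):

```python
import math

def log_likelihood(xs, mu, sigma):
    """Log-likelihood of the shifted exponential f(x) = (1/sigma) e^{-(x-mu)/sigma}, x > mu."""
    if min(xs) <= mu:
        return float("-inf")  # the indicator 1(x_(1) > mu) fails
    n = len(xs)
    return -n * math.log(sigma) - (sum(xs) - n * mu) / sigma

# Two different samples with the same sufficient statistic (min, sum) = (1.0, 9.0):
a = [1.0, 2.0, 6.0]
b = [1.0, 3.0, 5.0]

for mu, sigma in [(0.5, 1.0), (0.9, 2.5)]:
    assert log_likelihood(a, mu, sigma) == log_likelihood(b, mu, sigma)
```

Since the log-likelihood is a function of the data only through $x_{(1)}$ (via the indicator) and $\sum x_i$, the two samples are indistinguishable for inference about $(\mu, \sigma)$, which is exactly what sufficiency asserts.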


Both estimators are biased. Note that your distribution is a shifted exponential distribution, with shift parameter $\mu$. By using the cumulative distribution function of the minimum, i.e., $$ F_{X_{(1)}}(x)=1-(1-F_X(x))^n, $$
you can easily show that $X_{(1)}$ follows an exponential distribution with the same shift $\mu$ but rate $n/\sigma$, so $\mathbb E X_{(1)} = \mu + \sigma/n$. Thus, $$ \mathbb E\hat{\mu}= \mu + \sigma/n > \mu\,\, ,\forall n\in\mathbb N. $$ Similarly, since $\hat\sigma = \overline X - X_{(1)}$, $$ \mathbb E \hat{\sigma} = \mathbb E \overline X - \mathbb E X_{(1)} = \sigma + \mu - \mu - \sigma/n = \sigma(1-1/n) < \sigma\,\, , \forall n \in \mathbb N. $$
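A quick Monte Carlo sketch (Python; the helper names are my own, not from the original) confirms these two expectations numerically by averaging the MLEs over many simulated samples:

```python
import random

def mle_estimates(xs):
    """MLEs for the shifted exponential: mu_hat = min(x), sigma_hat = mean(x) - min(x)."""
    m = min(xs)
    return m, sum(xs) / len(xs) - m

def average_estimates(mu, sigma, n, reps=50_000, seed=1):
    """Approximate E[mu_hat] and E[sigma_hat] by averaging over reps samples."""
    rng = random.Random(seed)
    mu_sum = sigma_sum = 0.0
    for _ in range(reps):
        # Shifted exponential draw: mu + Exp(rate = 1/sigma)
        xs = [mu + rng.expovariate(1.0 / sigma) for _ in range(n)]
        mh, sh = mle_estimates(xs)
        mu_sum += mh
        sigma_sum += sh
    return mu_sum / reps, sigma_sum / reps

mu, sigma, n = 2.0, 3.0, 5
avg_mu, avg_sigma = average_estimates(mu, sigma, n)
# Theory predicts E[mu_hat] = mu + sigma/n = 2.6 and E[sigma_hat] = sigma*(1 - 1/n) = 2.4,
# so both averages should land near those biased values, not near mu = 2.0 and sigma = 3.0.
```

Rescaling gives unbiased versions: $\frac{n}{n-1}\hat\sigma$ has expectation $\sigma$, which is a natural starting point for the minimum variance unbiased estimator the question asks about.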