Applying an unbiased estimator by using a linear function


Let $X_1,\dots,X_n$ be a sample from a uniform distribution on $(0,\theta)$, where $\theta > 0$ is an unknown parameter.

How do I construct an unbiased estimator by applying a linear function to each of the sample mean $\bar{X}$ and the $n$th order statistic $X_{(n)}$?

I have calculated the expectation and variance of $\bar{X}$:

$E(\bar{X})=\frac{\theta}{2}$

$Var(\bar{X})=\frac{\theta^2}{12n}$

I have calculated the expectation and variance of $X_{(n)}$:

$E(X_{(n)})=\frac{n}{n+1}\theta$

$Var(X_{(n)})=\frac{n}{(n+1)^2(n+2)}\theta^2$
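These moments are easy to check by simulation. The sketch below (using numpy, with $\theta = 3$ and $n = 10$ as illustrative values not taken from the question) compares Monte Carlo estimates against the formulas; note that the variance of the sample mean carries a factor $1/n$, and the variance of the maximum carries a factor $1/(n+2)$:

```python
import numpy as np

# Illustrative assumptions for this sketch: theta = 3, n = 10.
rng = np.random.default_rng(0)
theta, n, reps = 3.0, 10, 200_000

# Each row is one sample of size n from Uniform(0, theta).
samples = rng.uniform(0, theta, size=(reps, n))
xbar = samples.mean(axis=1)   # sample mean of each replicate
xmax = samples.max(axis=1)    # nth order statistic of each replicate

print(xbar.mean(), theta / 2)                              # E[Xbar] = theta/2
print(xbar.var(), theta**2 / (12 * n))                     # Var(Xbar) = theta^2/(12n)
print(xmax.mean(), n * theta / (n + 1))                    # E[X_(n)] = n*theta/(n+1)
print(xmax.var(), n * theta**2 / ((n + 1)**2 * (n + 2)))   # Var(X_(n))
```

With 200,000 replicates the empirical values should agree with the formulas to two or three decimal places.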

Best answer

An estimator $\hat \theta$ of some parameter $\theta$ is unbiased if $$\operatorname{E}[\hat \theta] = \theta.$$ That is to say, the expected value of the estimator is equal to the parameter. Since you found the expectation of the sample mean $\bar X$ is $$\operatorname{E}[\bar X] = \theta/2,$$ is there a constant that you can multiply $\bar X$ by that would change the right hand side to $\theta$? Recall that $$\operatorname{E}[cX] = c \operatorname{E}[X]$$ for any fixed constant $c$. Once you find this constant, it will give you an estimator for $\theta$ based on the sample mean.
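To make the hint concrete: since $\operatorname{E}[\bar X] = \theta/2$, the constant is $c = 2$, so $2\bar X$ is unbiased for $\theta$. A quick sketch checking this (with $\theta = 5$ and $n = 25$ as illustrative values, not from the question):

```python
import numpy as np

# Since E[Xbar] = theta/2, multiplying by c = 2 gives E[2*Xbar] = theta.
rng = np.random.default_rng(1)
theta, n, reps = 5.0, 25, 100_000

xbar = rng.uniform(0, theta, size=(reps, n)).mean(axis=1)
est = 2 * xbar                 # the rescaled, unbiased estimator
print(est.mean(), theta)       # empirical mean should be close to theta
```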

Similarly, for the maximum order statistic $X_{(n)}$, what could you multiply by? Note here that $n$ is still a constant: the sample size is presumed fixed and known, so with respect to the expectation of a sample statistic, $n$ is a constant as well.
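Spelling out this second hint: since $\operatorname{E}[X_{(n)}] = \frac{n}{n+1}\theta$, multiplying by $\frac{n+1}{n}$ gives the unbiased estimator $\frac{n+1}{n} X_{(n)}$. A simulation sketch (again with the illustrative values $\theta = 5$, $n = 25$):

```python
import numpy as np

# Since E[X_(n)] = n*theta/(n+1), scaling by (n+1)/n removes the bias.
rng = np.random.default_rng(2)
theta, n, reps = 5.0, 25, 100_000

xmax = rng.uniform(0, theta, size=(reps, n)).max(axis=1)
est = (n + 1) / n * xmax       # the rescaled, unbiased estimator
print(est.mean(), theta)       # empirical mean should be close to theta
```

This estimator also has much smaller variance than $2\bar X$ for large $n$, which is why the maximum-based estimator is usually preferred in this problem.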