What is $ g(\theta) $ in a uniform distribution?


Let $X_1,\dots,X_n$ be independent and $U[0,\theta]$-distributed, with $\theta>0$ unknown.

Determine the mean squared error of the estimator $cX_{(n)}$ of $\theta$, for every value of $c>0$.


Now I have found that $\operatorname{Var}_{\theta}(cX_{(n)}) = \frac{c^2 n\theta^2}{(n+1)^2(n+2)}$ and $E_{\theta}(cX_{(n)}) = \frac{cn\theta}{n+1}$.

But $\operatorname{MSE} = \operatorname{Var}_{\theta}(cX_{(n)}) + \left(E_{\theta}(cX_{(n)}) - g(\theta)\right)^2$, and I do not know what $g(\theta)$ is. Can anybody help me understand how to calculate it?

Accepted answer:

The formula for mean squared error is $$\operatorname{MSE}[w(\theta)] = \operatorname{Var}[w(\theta)] + \operatorname{Bias}[w(\theta)]^2,$$ where $w(\theta)$ is an estimator of $\theta$. In this case, $w(\theta) = cX_{(n)}$, some constant $c$ multiplied by the maximum order statistic of the sample.
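In case the decomposition itself is unclear, it follows by adding and subtracting $\operatorname{E}[W]$ inside the square, where $W = w(\theta)$ denotes the estimator; the cross term vanishes because $\operatorname{E}\big[W - \operatorname{E}[W]\big] = 0$:

$$\operatorname{E}\big[(W - \theta)^2\big] = \operatorname{E}\big[(W - \operatorname{E}[W])^2\big] + 2\big(\operatorname{E}[W]-\theta\big)\operatorname{E}\big[W - \operatorname{E}[W]\big] + \big(\operatorname{E}[W] - \theta\big)^2 = \operatorname{Var}[W] + \operatorname{Bias}[W]^2.$$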

Since what we are estimating is $\theta$ itself, and not some function of $\theta$, your $g(\theta)$ is simply $\theta$. The idea is to calculate the bias of this estimator as a function of $c$: $$\operatorname{Bias}[cX_{(n)}] = \operatorname{E}[cX_{(n)} - \theta] = c \operatorname{E}[X_{(n)}] - \theta.$$ Then square it, and add to it the variance of the estimator. This gives you the mean squared error for this estimator.
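For completeness, the pieces can be computed from the distribution of the maximum: since the $X_i$ are i.i.d. $U[0,\theta]$, we have $P(X_{(n)} \le x) = (x/\theta)^n$ on $[0,\theta]$, so $X_{(n)}$ has density $\frac{n x^{n-1}}{\theta^n}$, and

$$\operatorname{E}[X_{(n)}] = \frac{n\theta}{n+1}, \qquad \operatorname{E}[X_{(n)}^2] = \frac{n\theta^2}{n+2}.$$

Expanding $\operatorname{E}\big[(cX_{(n)}-\theta)^2\big]$ directly then gives

$$\operatorname{MSE}[cX_{(n)}] = c^2\,\frac{n\theta^2}{n+2} - 2c\,\frac{n\theta^2}{n+1} + \theta^2 = \theta^2\left(\frac{c^2 n}{n+2} - \frac{2cn}{n+1} + 1\right),$$

which, if you go on to minimize over $c$, is smallest at $c = \frac{n+2}{n+1}$.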

In the general case, sometimes we do not want to estimate $\theta$, but instead we might be interested in a function of the parameter; e.g., $\theta^2$, or $1/\theta$. This is the function $g$ used in your formula. So the bias of some estimator $w(\theta)$ of $g(\theta)$ is then $$\operatorname{E}[w(\theta) - g(\theta)].$$
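As a sanity check (not part of the original answer), a quick Monte Carlo simulation can confirm the closed-form MSE of $cX_{(n)}$ above. The values of `theta`, `n`, and `c` below are illustrative choices, not taken from the question:

```python
import numpy as np

# Monte Carlo check of the MSE of c * X_(n) for U[0, theta] samples.
# theta, n, c are illustrative choices, not from the original post.
rng = np.random.default_rng(0)
theta, n, c = 2.0, 5, 1.0
reps = 200_000

samples = rng.uniform(0.0, theta, size=(reps, n))
x_max = samples.max(axis=1)   # X_(n) for each replication
estimates = c * x_max

# Simulated MSE: average squared error of the estimator against theta
mse_sim = np.mean((estimates - theta) ** 2)

# Closed form: E[(c X_(n) - theta)^2]
#   = theta^2 * (c^2 * n/(n+2) - 2*c*n/(n+1) + 1)
mse_exact = theta**2 * (c**2 * n / (n + 2) - 2 * c * n / (n + 1) + 1)

# The two values should agree closely for large reps
print(mse_sim, mse_exact)
```

With a large number of replications, the simulated and exact values agree to roughly two decimal places, which is a cheap way to catch sign or denominator mistakes in the hand derivation.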