Finding the Mean Squared Error of an Estimator


The positive random variables $X_{1}, X_{2}, \ldots, X_{n}$ are independent and identically distributed as $Ge(\theta)$.

The maximum likelihood estimator of $\psi = \frac{1 - \theta}{\theta}$ is the sample mean $\bar X$.

Prove that the alternative estimator $t(X) = \frac{n \bar X}{n + 1}$ has a smaller mean squared error for estimating $\psi$ than $\bar X$, for all $\theta$.
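A quick Monte Carlo check suggests the claim holds. This is only a sketch, assuming $Ge(\theta)$ has support $\{0, 1, 2, \ldots\}$ with mean $\psi = (1-\theta)/\theta$ (so that $\bar X$ estimates $\psi$); the values $\theta = 0.3$ and $n = 10$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

theta, n, trials = 0.3, 10, 200_000  # arbitrary illustrative values
psi = (1 - theta) / theta            # target parameter (1 - theta) / theta

# numpy's geometric has support {1, 2, ...}; shift to {0, 1, ...} so E[X] = psi
X = rng.geometric(theta, size=(trials, n)) - 1

xbar = X.mean(axis=1)          # sample-mean estimator of psi
t = n * xbar / (n + 1)         # shrunken alternative estimator t(X)

mse_xbar = np.mean((xbar - psi) ** 2)
mse_t = np.mean((t - psi) ** 2)
print(mse_xbar, mse_t)         # t(X) should show the smaller MSE
```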

My attempt:

$\operatorname{bias}(\hat\theta) = E(\hat\theta) - \theta$

$\operatorname{bias}(\hat\psi) = E(\hat\psi) - \psi$

$E(\hat\psi) = \frac{1 - E(\theta)}{E(\theta)}$

$E(\theta) = \frac{1}{\bar X}$

$E(\hat\psi) = \bar X - 1$
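I know the mean squared error decomposes as (a standard identity, stated here for reference):

$MSE(\hat\psi) = E\left[(\hat\psi - \psi)^2\right] = Var(\hat\psi) + \left[bias(\hat\psi)\right]^2$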

And then I get lost...

Could someone please demonstrate how to find the MSE for one of these estimators? Then I can do the other one myself.