Hello, I am trying to determine which estimator is more efficient for the parameter $\theta$, the method of moments estimator or the maximum likelihood estimator, for the distribution $$f(x;\theta) = \frac 1 \theta \left(\frac 1 x\right)^{\frac 1 \theta + 1} $$ for $x > 1$, $0 < \theta < 1$.
Equating the first moment to the sample mean, I found $\bar{x} = \frac{1}{1 - \theta}$, which implies the method of moments estimator is $\hat\theta_1 = 1 - \frac 1 {\bar{x}}$. I also found the maximum likelihood estimator to be $\hat\theta_2 = \bar{y}$, where $y_i = \ln x_i$; that is, the MLE is simply the mean of the natural logarithms of the observations. Furthermore, I found that $\hat\theta_1$ is biased, while $\hat\theta_2$ is unbiased and is in fact a minimum variance unbiased estimator of $\theta$. By definition, one estimator is relatively more efficient than another if it has the smaller variance, and by the Cramér-Rao lower bound theorem I know that for this distribution the maximum likelihood estimator $\hat\theta_2$ attains the smallest possible variance among all unbiased estimators.
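For completeness, here is the first-moment computation I used (my own work, so please check it; the convergence step uses $0 < \theta < 1$, so that $\frac{1}{\theta} > 1$):

$$E[X] = \int_1^\infty x \cdot \frac{1}{\theta}\, x^{-\frac{1}{\theta}-1}\,dx = \frac{1}{\theta}\int_1^\infty x^{-\frac{1}{\theta}}\,dx = \frac{1}{\theta}\cdot\frac{1}{\frac{1}{\theta}-1} = \frac{1}{1-\theta},$$

and the likelihood equation that produced $\hat\theta_2$:

$$\ell(\theta) = -n\ln\theta - \left(\frac{1}{\theta}+1\right)\sum_{i=1}^n \ln x_i, \qquad \ell'(\theta) = -\frac{n}{\theta} + \frac{1}{\theta^2}\sum_{i=1}^n \ln x_i = 0 \;\Longrightarrow\; \hat\theta_2 = \frac{1}{n}\sum_{i=1}^n \ln x_i.$$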
The difficulty I am having in determining which of these two estimators is more efficient (has the smaller variance) is computing the variance of the method of moments estimator $\hat\theta_1 = 1 - \frac{1}{\bar{x}}$. Because this estimator is biased, I believe it could in principle have a smaller variance than the maximum likelihood estimator $\hat\theta_2$. Is there any way to determine which of these two estimators is relatively more efficient for $\theta$ without having to compute the variance of the biased estimator $\hat\theta_1$? And if not, how would I go about computing the variance of $\hat\theta_1 = 1 - \frac{1}{\bar{x}}$?
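In case it is useful, here is a quick Monte Carlo check I put together to compare the two estimators numerically (this is only a sketch: the inverse-CDF step uses $F(x) = 1 - x^{-1/\theta}$, so $X = U^{-\theta}$ for $U \sim \mathrm{Uniform}(0,1)$, and the particular values of $\theta$, $n$, and the replication count are just ones I picked):

```python
import numpy as np

# Sample X by inverse-CDF: F(x) = 1 - x^(-1/theta) for x > 1,
# so X = U^(-theta) with U ~ Uniform(0, 1).
rng = np.random.default_rng(0)
theta, n, reps = 0.3, 50, 20000       # arbitrary choices for the check

u = rng.random((reps, n))
x = u ** (-theta)                     # reps samples of size n from f(x; theta)

xbar = x.mean(axis=1)
t1 = 1 - 1 / xbar                     # method of moments estimate, per replication
t2 = np.log(x).mean(axis=1)           # MLE: mean of ln(x_i), per replication

print("MoM : mean %.4f  var %.6f  MSE %.6f"
      % (t1.mean(), t1.var(), ((t1 - theta) ** 2).mean()))
print("MLE : mean %.4f  var %.6f  MSE %.6f"
      % (t2.mean(), t2.var(), ((t2 - theta) ** 2).mean()))
print("CRLB theta^2/n = %.6f" % (theta ** 2 / n))
```

Running this for a few values of $\theta$ suggests an ordering, but of course a simulation is no substitute for the analytic variance I am asking about.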