Let $X_1,\cdots, X_n$ be a random sample with pdf
$$p(x\mid \theta) = \theta x^{-2}, \text{ with } 0<\theta\le x < \infty$$
Use the method of moments to estimate $\theta$.
This is problem 7.6 from Casella and Berger's *Statistical Inference*. Strangely, I found that the first moment does not exist:
$$E (X) = \int_\theta^\infty x \cdot \theta x^{-2} \,dx = \int_\theta^\infty \theta x^{-1} \,dx = \theta \ln(x)\biggr|_{\theta}^\infty = \infty.$$
What should I do in this situation?
Transform your random variable to:
$$ Y =X^{\frac{1}{2}},$$
and find its moment:
$$E(Y) = E \left(X^\frac{1}{2}\right) = \int_\theta^\infty \theta x^{-\frac{3}{2}} \,dx = \left.-2\theta x^{-\frac{1}{2}}\right|_\theta^{+\infty} = 2 \sqrt{\theta}.$$
Given $N$ observations $X_1, \dots, X_N$, a method-of-moments estimator for $\alpha = \sqrt{\theta}$ is:
$$\hat\alpha = \frac{1}{2N} \left(\sum_{i=1}^N \sqrt{X_i}\right),$$
which is unbiased, since:
$$E(\hat\alpha) = \frac{1}{2N} \cdot N \cdot 2 \sqrt{\theta} = \sqrt{\theta}.$$
Squaring then gives the method-of-moments estimate of $\theta$ itself: $\hat\theta = \hat\alpha^2$.
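As a quick numerical sanity check, here is a short simulation sketch. It assumes a hypothetical true value $\theta = 4$ and samples from the density via inverse-CDF sampling, using the fact that $F(x) = 1 - \theta/x$ for $x \ge \theta$, so $X = \theta/U$ with $U \sim \mathrm{Uniform}(0,1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 4.0        # hypothetical true parameter, for illustration only
N = 100_000

# Inverse-CDF sampling: F(x) = 1 - theta/x for x >= theta,
# so X = theta / U with U ~ Uniform(0, 1).
u = rng.uniform(size=N)
x = theta / u

# Method-of-moments estimate of alpha = sqrt(theta) from the
# sample mean of sqrt(X), then square to recover theta.
alpha_hat = np.sqrt(x).mean() / 2
theta_hat = alpha_hat ** 2

print(alpha_hat)   # should be near sqrt(theta) = 2
print(theta_hat)   # should be near theta = 4
```

Note that convergence here is slower than the usual $1/\sqrt{N}$ intuition suggests, since $\mathrm{Var}(\sqrt{X}) = E(X) - 4\theta$ is infinite; the sample mean of $\sqrt{X_i}$ still converges by the strong law of large numbers, because $E(\sqrt{X})$ is finite.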