Suppose $X_1,\ldots,X_n$ are i.i.d. random variables with density $$f(x;\theta)=\theta x^{-2}, \qquad x>\theta,\quad \theta>0.$$
The smallest order statistic $X_{(1)}$ is sufficient and complete for $\theta$. We want to estimate $P(X_1>t)$ for a constant $t>0$. Using the fact that $I(X_1>t)$ is an unbiased estimator of $P(X_1>t)$, find the UMVUE of $P(X_1>t)$.
Finding a UMVUE is not an easy task. How can I solve this problem?
The Lehmann–Scheffé theorem says that the conditional expected value of any unbiased estimator given the value of a complete sufficient statistic is the UMVUE. So the first question is whether there is a complete sufficient statistic. The joint density is \begin{align} f(x_1,\ldots,x_n) & = \theta^n x_1^{-2} \cdots x_n^{-2} = \theta^n(x_1\cdots x_n)^{-2} \quad \text{for } x_1,\ldots,x_n>\theta \\[10pt] & = \theta^n(x_1\cdots x_n)^{-2} I_{[\theta,\infty)}(\min\{x_1,\ldots,x_n\}). \end{align} This has one factor, namely $\theta^nI_{[\theta,\infty)}(\min\{x_1,\ldots,x_n\})$, that depends on $(x_1,\ldots,x_n)$ only through $\min\{x_1,\ldots,x_n\}$, and another factor (in this case $(x_1\cdots x_n)^{-2}$) that does not depend on $\theta$. This is a Fisher factorization; therefore $\min\{X_1,\ldots,X_n\}$ is sufficient for $\theta$.
If that statistic is complete, then the UMVUE will be the conditional expected value $$ \operatorname{E}( I(X_1>t) \mid \min\{X_1,\ldots,X_n\}) = \Pr(X_1>t\mid \min\{X_1,\ldots,X_n\}). $$ (That this conditional expectation is a statistic, i.e. does not depend on $\theta$, follows from the sufficiency of the minimum.)
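Completeness of the minimum (asserted in the problem statement) can also be checked directly; here is a sketch, glossing over integrability details. Since $\Pr(X_i>x)=\theta/x$ for $x\ge\theta$,
$$ \Pr\big(X_{(1)}>x\big) = \left(\frac\theta x\right)^n, \qquad x\ge\theta, $$
so $X_{(1)}$ has density $n\theta^n x^{-(n+1)}$ for $x>\theta$. If $\operatorname E g(X_{(1)})=0$ for every $\theta>0$, then
$$ \int_\theta^\infty g(x)\, x^{-(n+1)}\,dx = 0 \quad \text{for all } \theta>0, $$
and differentiating with respect to $\theta$ gives $g(\theta)\theta^{-(n+1)}=0$ for almost every $\theta$, so $g=0$ almost everywhere. Thus $X_{(1)}$ is complete.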
Notice that if $\min\{X_1,\ldots,X_n\}>t$ then a fortiori $X_1>t$. Hence $\Pr(X_1>t\mid \min\{X_1,\ldots,X_n\}=x) = 1$ if $x>t$.
If $x<t$, how do we find $\Pr(X_1>t\mid \min\{X_1,\ldots,X_n\}=x)$?
\begin{align} & \Pr(X_1>t\mid \min = x) \\[10pt] = {} & \Pr(X_1=\min\mid\min=x)\Pr(X_1>t\mid X_1=\min,\ \min=x) \\ & {} + \Pr(X_1\ne\min\mid\min=x)\Pr(X_1>t\mid X_1\ne\min,\ \min=x) \\[10pt] = {} & \frac 1 n \cdot 0 + \frac{n-1}n \Pr(X_1>t\mid X_1\ne\min,\ \min=x). \end{align}
By symmetry, each of the $n$ observations is equally likely to be the minimum, so $\Pr(X_1=\min\mid\min=x)=\frac1n$; and the first conditional probability is $0$ because in that case $X_1=x<t$. Given $\min=x$ and $X_1\ne\min$, $X_1$ is distributed as $X_1$ conditioned on the event $X_1>x$, so $$ \Pr(X_1>t\mid X_1\ne\min,\ \min=x) = \frac{\Pr(X_1>t)}{\Pr(X_1>x)} = \frac{\theta/t}{\theta/x} = \frac xt, $$ since $\Pr(X_1>s)=\int_s^\infty \theta u^{-2}\,du = \theta/s$ for $s\ge\theta$. Note that $\theta$ cancels, as it must, since the minimum is sufficient. Combining the two cases, the UMVUE is $$ \Pr\big(X_1>t\mid X_{(1)}\big) = \begin{cases} 1 & \text{if } X_{(1)}\ge t, \\[6pt] \dfrac{(n-1)X_{(1)}}{nt} & \text{if } X_{(1)}<t. \end{cases} $$
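The computation yields the estimator $\hat p=1$ if $X_{(1)}\ge t$ and $\hat p=(n-1)X_{(1)}/(nt)$ if $X_{(1)}<t$ (the omitted factor is $x/t$, obtained by truncating the density at $x$). As a sanity check, here is a short Monte Carlo sketch in Python; the parameter values are illustrative, not from the original problem. It verifies that the estimator averages to $\Pr(X_1>t)=\theta/t$.

```python
import random

def draw_sample(n, theta, rng):
    # Inverse-CDF sampling: F(x) = 1 - theta/x for x >= theta,
    # so X = theta / U with U ~ Uniform(0, 1].
    return [theta / (1.0 - rng.random()) for _ in range(n)]

def umvue(xs, t):
    # Candidate UMVUE of P(X_1 > t), a function of the sample minimum.
    n, m = len(xs), min(xs)
    return 1.0 if m >= t else (n - 1) * m / (n * t)

def monte_carlo_mean(n, theta, t, reps, seed=0):
    # Average the estimator over many simulated samples.
    rng = random.Random(seed)
    return sum(umvue(draw_sample(n, theta, rng), t) for _ in range(reps)) / reps

# True value is P(X_1 > t) = theta / t = 0.5 for these parameters.
print(monte_carlo_mean(n=5, theta=1.0, t=2.0, reps=200_000))
```

With these parameters the printed average should be close to $0.5$, consistent with unbiasedness.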