Let $X_1, X_2, \dots, X_n$ be a random sample from the distribution with pdf $$f(x; \theta) = \theta x^{\theta - 1} I_{(0,1)}(x)$$ for $\theta > 0$.
(a) Find the UMVUE for $\dfrac{1}{\theta}$.
(b) Find the UMVUE for $\left(\dfrac{\theta}{\theta + 1}\right)^n$.
I found the Fisher information to be $I(\theta) = \dfrac{n}{\theta^2}$, so the CRLB is $\dfrac{\theta^2}{n}$.
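Spelled out, since $\ln f(x;\theta) = \ln\theta + (\theta - 1)\ln x$,
$$\frac{\partial}{\partial\theta}\ln f(x;\theta) = \frac{1}{\theta} + \ln x, \qquad \frac{\partial^2}{\partial\theta^2}\ln f(x;\theta) = -\frac{1}{\theta^2},$$
so the single-observation information is $I_1(\theta) = \frac{1}{\theta^2}$ and $I(\theta) = n I_1(\theta) = \frac{n}{\theta^2}$.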
I computed the joint pdf and rewrote it as $$f(\mathbf{x}; \theta) = \left(\prod_{i=1}^n I_{(0,1)}(x_i)\right) \theta^n e^{(\theta - 1)\sum_{i=1}^n \ln(x_i)}.$$
That would show, by the exponential-family factorization, that $\sum_{i=1}^n \ln(x_i)$ is a complete sufficient statistic for this family, right?
How do I find the UMVUE from here? I've seen a similar problem that relied on the Pareto distribution, but this is not Pareto since the exponent is positive, right?
This is a Beta$(\theta, 1)$ distribution, with support $(0,1)$.
Indeed, since the density is a member of a full-rank exponential family, $T=\sum\limits_{i=1}^n\ln X_i$ is a complete sufficient statistic for this family of distributions. For the UMVUE you have the Lehmann–Scheffé theorem, which says that an unbiased estimator that is a function of a complete sufficient statistic is the (unique) minimum-variance unbiased estimator.
It is easy to verify $E_{\theta}[-\ln X_i]=\frac1{\theta}$ for all $\theta>0$ via direct integration.
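Explicitly, integrating by parts,
$$E_{\theta}[-\ln X_i] = -\int_0^1 (\ln x)\,\theta x^{\theta-1}\,dx = -\Big[x^{\theta}\ln x\Big]_0^1 + \int_0^1 x^{\theta-1}\,dx = 0 + \frac{1}{\theta} = \frac{1}{\theta}.$$
(Equivalently, $-\ln X_i \sim \text{Exponential}(\theta)$, which has mean $1/\theta$.)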
So you have $$E_{\theta}\left[-\frac Tn \right]=\frac1{\theta}\quad,\,\forall\,\theta>0,$$ and since $-\frac Tn$ is an unbiased function of the complete sufficient statistic $T$, it is the UMVUE of $\frac1\theta$.
Again, as mentioned in the comments, by independence of the $X_i$, $$E_{\theta}\left[\prod_{i=1}^n X_i\right]=\left(E_{\theta}(X_1)\right)^n=\left(\frac{\theta}{1+\theta}\right)^n\quad,\,\forall\,\theta.$$
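The single factor is a direct integration:
$$E_{\theta}[X_1] = \int_0^1 x\cdot\theta x^{\theta-1}\,dx = \theta\int_0^1 x^{\theta}\,dx = \frac{\theta}{\theta+1}.$$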
Now as $T=\ln \left(\prod\limits_{i=1}^n X_i\right)$ is complete sufficient, so is the one-to-one function $e^T=\prod\limits_{i=1}^n X_i$.
Hence conclude.
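As a numerical sanity check (not part of the derivation above), here is a short Monte Carlo sketch. It samples from the density via the inverse CDF ($F(x)=x^\theta$ gives $X = U^{1/\theta}$ for $U$ uniform) and averages the two estimators, $-T/n$ and $\prod_i X_i$, over many replications; the function names are my own.

```python
import math
import random

def sample_x(theta, rng):
    # Inverse-CDF sampling: F(x) = x^theta on (0,1), so X = U^(1/theta).
    return rng.random() ** (1.0 / theta)

def umvue_estimates(theta, n, reps, seed=0):
    """Average the two UMVUEs over `reps` samples of size n."""
    rng = random.Random(seed)
    avg_inv_theta = 0.0   # running mean of -T/n, should approach 1/theta
    avg_prod = 0.0        # running mean of prod(X_i), should approach (theta/(theta+1))^n
    for _ in range(reps):
        xs = [sample_x(theta, rng) for _ in range(n)]
        avg_inv_theta += -sum(math.log(x) for x in xs) / n
        avg_prod += math.prod(xs)
    return avg_inv_theta / reps, avg_prod / reps
```

For example, with $\theta = 2$ and $n = 5$ the averages should settle near $1/\theta = 0.5$ and $(2/3)^5 \approx 0.1317$.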