Let $X_1, \dots, X_n$ denote a random sample from the PDF
$$f_{\varphi}(x)= \begin{cases} \varphi x^{\varphi - 1} &\text{if } 0 < x < 1,\\ 0 &\text{otherwise,} \end{cases} \qquad \varphi > 0.$$
This density function is a member of the one-parameter exponential family.
Let $A_i = - \log(X_i)$; then $A_i$ has an exponential distribution with rate $\varphi$. This means that $2\varphi \sum_{i = 1}^n A_i$ has a $\chi^2$ distribution with $2n$ degrees of freedom. Furthermore, I have that
$$ E\left[ \left( 2\varphi \sum_{i = 1}^n A_i \right)^{-1} \right] = \dfrac{1}{2n - 2} $$
and
$$\text{Var} \left( \dfrac{1}{2 \varphi \sum_{i = 1}^n A_i} \right) = \dfrac{1}{4(n - 1)^2(n - 2)}$$
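For readers who want to trace these facts, here is a short sketch (my addition, not part of the original question) of where the $\chi^2_{2n}$ distribution and the inverse moment come from:

```latex
\begin{align*}
P(A_i > a) &= P\bigl(X_i < e^{-a}\bigr) = \int_0^{e^{-a}} \varphi x^{\varphi - 1}\,dx = e^{-\varphi a},
  \quad a > 0, \ \text{so } A_i \sim \operatorname{Exp}(\varphi) = \operatorname{Gamma}(1, \varphi);\\
\sum_{i=1}^n A_i &\sim \operatorname{Gamma}(n, \varphi)
  \quad \text{(sum of i.i.d.\ Gamma variables with a common rate)};\\
2\varphi \sum_{i=1}^n A_i &\sim \operatorname{Gamma}\bigl(n, \tfrac{1}{2}\bigr) = \chi^2_{2n}
  \quad \text{(rescaling by } 2\varphi \text{ divides the rate)};\\
E\bigl[W^{-1}\bigr] &= \frac{1}{\nu - 2} \ \text{for } W \sim \chi^2_\nu,\ \nu > 2,
  \quad \text{which with } \nu = 2n \text{ gives } \frac{1}{2n - 2}.
\end{align*}
```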
Apparently, using this information, we can conclude that $\dfrac{n - 1}{\sum_{i = 1}^n A_i} = \dfrac{n - 1}{- \sum_{i = 1}^n \log(X_i)}$ is a UMVUE for $\varphi$. However, I am not able to follow this reasoning, neither for how they calculated this estimator nor for how they concluded that it is a UMVUE. If I had to guess, the factor of $n - 1$ looks like some kind of bias correction, but I'm honestly not sure. What is the reasoning for why this estimator is a UMVUE for $\varphi$? I'd greatly appreciate it if someone would take the time to fill in the gaps and explain this so that I may understand it.
Note that the density can be written $$f(x \mid \varphi) = \varphi \frac 1 x e^{\varphi \log x}$$
with the constraints on $x, \varphi$ as given. By the factorization theorem, $\log x$ is a sufficient statistic for $\varphi$. For a sample, the likelihood is $$f(\mathbf x \mid \varphi) = \varphi^n \frac{1}{\prod_{i=1}^n x_i}\, e^{\varphi \sum_{i=1}^n \log x_i}$$
and $\sum_i \log x_i$ is sufficient. Because the exponential family is full rank (not curved), $\sum_i \log x_i$ is complete as well (could someone verify this? I'm not 100% sure on this part). Anyway, since we have a CSS (complete sufficient statistic), by the Lehmann–Scheffé theorem any function of the CSS whose expected value is $\varphi$ is the UMVUE of $\varphi$.
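Regarding the completeness question: a sufficient condition (my addition, invoking the standard completeness theorem for full-rank exponential families) is that the family can be written in canonical form with a natural parameter space containing an open interval. Here

```latex
f(x \mid \varphi)
  = \underbrace{\frac{1}{x}\,\mathbf{1}_{(0,1)}(x)}_{h(x)}
    \exp\Bigl( \varphi \underbrace{\log x}_{T(x)} - \underbrace{(-\log \varphi)}_{A(\varphi)} \Bigr),
  \qquad \varphi \in (0, \infty),
```

and $(0, \infty)$ is an open interval in $\mathbb R$, so the family is full rank and $\sum_i T(x_i) = \sum_i \log x_i$ is indeed complete.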
You have already written that $$E\left[\left(2\varphi\sum_{i=1}^n A_i\right)^{-1}\right] = E\left[\left(-2\varphi\sum_{i=1}^n \log X_i\right)^{-1}\right] = \frac{1}{2n-2}.$$ Multiplying both sides by $2\varphi(n - 1)$ gives $$E\left[\frac{n-1}{-\sum_{i=1}^n \log X_i}\right] = \varphi.$$ So your guess was correct: the factor $n - 1$ is precisely the bias correction that makes the estimator unbiased. Since the estimator is also a function of the complete sufficient statistic $\sum_i \log X_i$, the Lehmann–Scheffé theorem guarantees that it is the UMVUE of $\varphi$.
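As a numerical sanity check (my addition, not part of the original argument), a quick Monte Carlo simulation with assumed values $\varphi = 2$ and $n = 5$ suggests that $(n-1)/(-\sum_i \log X_i)$ is unbiased while the MLE $n/(-\sum_i \log X_i)$ overshoots $\varphi$ by a factor of $n/(n-1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
phi, n, reps = 2.0, 5, 200_000   # assumed illustration values

# Draw X_i ~ f_phi by inverse-CDF sampling: F(x) = x^phi on (0,1),
# so X = U**(1/phi) for U ~ Uniform(0,1).
x = rng.random((reps, n)) ** (1.0 / phi)
s = -np.log(x).sum(axis=1)       # sum of A_i = -log X_i, one value per replication

umvue = (n - 1) / s              # bias-corrected estimator
mle = n / s                      # maximum-likelihood estimator

print(f"E[UMVUE] ~ {umvue.mean():.4f}  (target {phi})")
print(f"E[MLE]   ~ {mle.mean():.4f}  (theory: n/(n-1) * phi = {n / (n - 1) * phi:.4f})")
```

The inverse-CDF draw works because the CDF of $f_\varphi$ is $F(x) = x^\varphi$ on $(0, 1)$; the simulated mean of the bias-corrected estimator should sit close to $\varphi$ even for this small $n$.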