Let $X_1,...,X_n$ be a random sample from the pdf $$f(x|\theta) = \theta x^{\theta-1} , 0 \leq x \leq 1, \theta >0.$$
I found that the maximum-likelihood estimator of $\theta$ is $$\hat{\theta} = \frac{-n}{\sum_{i=1}^n \ln(X_i)}.$$ Can anyone confirm that this is right?
Then, I want to determine whether $\hat{\theta}$ is biased. My approach is to calculate ${\bf E}[\hat{\theta}] = {\bf E}\left[\frac{-n}{\sum_{i=1}^n \ln(X_i)}\right]$, but then I am stuck. Could someone help me with this?
You got the right answer for the MLE. To find the expectation of $\hat{\theta}$, it helps to first find the distribution of the transformed variable $Y_i=-\log(X_i)$. It's a well-known distribution, and so is that of $-\sum \log(X_i)$, as well as that of $-1/\sum \log(X_i)$.
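To confirm the MLE, maximize the log-likelihood and check that the critical point is a maximum:

```latex
\begin{align*}
\ell(\theta) &= \sum_{i=1}^n \log\!\left(\theta X_i^{\theta-1}\right)
             = n\log\theta + (\theta-1)\sum_{i=1}^n \log X_i,\\
\ell'(\theta) &= \frac{n}{\theta} + \sum_{i=1}^n \log X_i = 0
\quad\Longrightarrow\quad
\hat{\theta} = \frac{-n}{\sum_{i=1}^n \log X_i},
\end{align*}
```

and $\ell''(\theta) = -n/\theta^2 < 0$, so this is indeed the maximizer.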
You have already found that the distribution of $Y_i=-\log(X_i)$ is exponential($1/\theta$), i.e., exponential with mean $1/\theta$ and rate $\theta$.
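For completeness, this follows from a change of variables: with $Y = -\log(X)$ we have $X = e^{-Y}$ and $|dx/dy| = e^{-y}$, so

```latex
f_Y(y) = f_X(e^{-y})\,e^{-y}
       = \theta\,\bigl(e^{-y}\bigr)^{\theta-1} e^{-y}
       = \theta e^{-\theta y}, \qquad y > 0,
```

which is the exponential pdf with rate $\theta$.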
According to section 4.2 of Pitman, "Probability", the sum of $n$ iid exponential($1/\theta$) random variables is gamma($n$, $1/\theta$). Thus $Z =\sum Y_i=-\sum \log(X_i)$ is distributed gamma($n$, $1/\theta$), with pdf
$f_Z(z) = \dfrac{\theta^n}{\Gamma(n)}z^{n-1}e^{-z\theta}$ for $z>0$, and mean $E(Z)=n/\theta$.
Notice that since a pdf must integrate to 1, it is straightforward to show
$\int_0^{\infty}z^{n-1}e^{-z\theta}dz=\dfrac{\Gamma(n)}{\theta^n}$. Call this result 1.
This will be useful in the following step to find $E(1/Z)$.
$E(1/Z) = \int_0^{\infty}\dfrac{1}{z}\cdot\dfrac{\theta^n}{\Gamma(n)}z^{n-1}e^{-z\theta}\,dz$

$= \dfrac{\theta^n}{\Gamma(n)} \int_0^{\infty}z^{(n-1)-1}e^{-z\theta}\,dz$
$= \dfrac{\theta^n}{\Gamma(n)}\dfrac{\Gamma(n-1)}{\theta^{n-1}}$ by result 1.
$=\dfrac{\theta\Gamma(n-1)}{\Gamma(n)}$
Now, according to Casella and Berger, Statistical Inference pg 99, a useful property of the gamma function is that
$\Gamma(\alpha+1)=\alpha\Gamma(\alpha)$, thus
$\Gamma(n)=\Gamma((n-1)+1) = (n-1)\Gamma(n-1)$ and
$E(1/Z) = \dfrac{\theta\Gamma(n-1)}{\Gamma(n)} =\dfrac{\theta\Gamma(n-1)}{(n-1)\Gamma(n-1)}= \theta/(n-1) $
It's now straightforward to find $E(\hat{\theta}) = E(n/Z) = nE(1/Z) = n\theta/(n-1)$. Since $n\theta/(n-1) \neq \theta$, the MLE is biased (upward); note that $\frac{n-1}{n}\hat{\theta}$ is an unbiased estimator of $\theta$.
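As a sanity check, a quick Monte Carlo simulation reproduces $E[\hat{\theta}] = n\theta/(n-1)$; the seed, $\theta$, $n$, and trial count below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)   # seed chosen arbitrarily
theta, n, trials = 2.0, 10, 200_000

# The cdf is F(x) = x^theta on [0, 1], so inverse-CDF sampling gives X = U^(1/theta)
u = rng.random((trials, n))
x = u ** (1.0 / theta)

# MLE computed for each simulated sample of size n
theta_hat = -n / np.log(x).sum(axis=1)

print(theta_hat.mean())       # close to n*theta/(n-1) = 20/9 ≈ 2.222, above theta = 2
print(n * theta / (n - 1))
```

The upward bias is clearly visible for small $n$ and shrinks as $n$ grows, since $n/(n-1) \to 1$.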