Let $f(x, \theta) = \frac{1}{\theta} x^{\frac{1-\theta}{\theta}}$, where $0 < x < 1$ and $\theta > 0$. Let $X_1, \dots, X_n$ be iid with density $f$.
Taking the log likelihood, I found $$ l(\theta) = -n\ln{\theta} + \frac{1-\theta}{\theta}\sum_{i=1}^n{\ln{X_i}} $$ $$ l'(\theta) = -\frac{n}{\theta} - \frac{1}{\theta^2}\sum{\ln{X_i}}$$
Setting $l'(\theta) = 0$ yields the MLE of $$\hat{\theta} = -\frac{1}{n}\sum{\ln(X_i)}$$
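(As a quick numerical sanity check of this estimator: the CDF is $F(x) = x^{1/\theta}$ on $(0, 1)$, so inverse-transform sampling gives $X = U^{\theta}$ with $U \sim \mathrm{Uniform}(0,1)$. A minimal Python sketch; the function names, seed, $\theta = 2.5$, and sample size are just illustrative choices:)

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(theta, n):
    # Inverse-transform sampling: F(x) = x**(1/theta) on (0, 1),
    # so F^{-1}(u) = u**theta with U ~ Uniform(0, 1).
    return rng.uniform(size=n) ** theta

def mle(x):
    # The MLE derived above: theta_hat = -(1/n) * sum(log(X_i))
    return -np.mean(np.log(x))

theta = 2.5  # illustrative value
print(mle(sample(theta, n=100_000)))  # should print something close to 2.5
```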
But I don't know how to show that $E(\hat{\theta}) = \theta$, i.e. that it's an unbiased estimator. I tried taking the expectation of both sides as follows:
$$ E(\hat{\theta}) = -\frac{1}{n}\sum{\ln{E(X_i)}} = -\frac{1}{n}\sum{\ln{\frac{1}{1 + \theta}}} = \ln(1 + \theta)$$ where I found $E(X_i)$ as follows:
$$ E(X) = \int_0^1{x f(x, \theta)\,dx} = \frac{1}{\theta}\int_0^1{x^{1/\theta}\,dx} = \frac{1}{1 + \theta} x^{\frac{1 + \theta} {\theta}}{\huge\rvert}_0^1 = \frac{1}{1 + \theta}$$
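(The value $E(X) = \frac{1}{1+\theta}$ itself does check out numerically; a minimal sketch with `scipy.integrate.quad`, where $\theta = 2.5$ is an arbitrary test value:)

```python
from scipy.integrate import quad

theta = 2.5  # arbitrary test value
f = lambda x: (1 / theta) * x ** ((1 - theta) / theta)  # the density above

ex, _ = quad(lambda x: x * f(x), 0, 1)
print(ex, 1 / (1 + theta))  # both should be ≈ 0.2857
```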
Okay, thanks to @CommongerG for pointing out my mistake in his answer. We have $$E(\hat{\theta}) = -\frac{1}{n}\sum{E(\ln{X_i})} = -E(\ln{X})$$ So I need to show $E(\ln{X}) = -\theta$. Well,
$$ E(\ln{X}) = \int_0^1{\ln{x} f(x, \theta) dx} = \theta^{-1}\int_0^1{x^{\frac{1-\theta}{\theta}}\ln{x}dx}$$
Thank you @MlleM for pointing out my calculus error. I finally got the desired result. Letting $u = \ln{x}$ and $dv = \theta^{-1}x^{\frac{1-\theta}{\theta}}\,dx$, we have $du = \frac{dx}{x}$ and $v = x^{\frac{1}{\theta}}$. Integration by parts then gives:
$$ E(\ln{X}) = x^{\frac{1}{\theta}}\ln{x}\,{\large\rvert}_0^1 - \int_0^1{x^{\frac{1}{\theta}}\frac{dx}{x}} = 0 - \int_0^1{x^{\frac{1}{\theta} - 1}dx} = -\theta x^{\frac{1}{\theta}}{\large\rvert}_0^1 = -\theta,$$ where the boundary term vanishes because $x^{\frac{1}{\theta}}\ln{x} \to 0$ as $x \to 0^+$.
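(The unbiasedness also survives a Monte Carlo check; a minimal sketch where the seed, $\theta$, $n$, and replication count are all arbitrary:)

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.5, 50, 20_000  # arbitrary illustrative values

# X = U**theta by inverse transform, so log(X) = theta * log(U).
u = rng.uniform(size=(reps, n))
theta_hat = -np.log(u ** theta).mean(axis=1)  # the MLE for each replication

print(theta_hat.mean())  # should be ≈ 2.5, consistent with E(theta_hat) = theta
```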
Your mistake is here: $$\ln\left(E(X_i)\right)\neq E\left(\ln(X_i)\right)$$
$$ E(\hat{\theta}) = -\frac{1}{n}\sum{E(\ln(X_i))}=-\frac{n}{n}E(\ln(X_i))=-E(\ln(X_i))$$
Now you just have to show $E(\ln{X_i})=-\theta$.
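If you want a numerical check before doing the integral, `scipy.integrate.quad` agrees (a minimal sketch with an arbitrary $\theta$):

```python
from math import log
from scipy.integrate import quad

theta = 2.5  # arbitrary test value
# E(ln X) = integral of ln(x) * f(x, theta) over (0, 1)
val, _ = quad(lambda x: log(x) * (1 / theta) * x ** ((1 - theta) / theta), 0, 1)
print(val, -theta)  # both should be ≈ -2.5
```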