The random variable $X$ has pdf $f_{\theta}(x)=\theta x^{\theta-1},\ 0<x<1,$ for an unknown parameter $\theta>0.$ Assume that $x_1,\dots,x_n$ are measured data from a sample on $X$.
a) Compute the ML-estimator $\hat{\theta}$ of $\theta.$
b) Compute $E[\ln X]$ as a function of $\theta$ if $X$ is distributed according to the above.
c) Is $\hat{\theta}$ unbiased?
Long story short, for a) and b) I have computed
a): $$\hat{\theta}=-\frac{n}{\sum_{k=1}^n\ln{X_k}},$$
b): $$E[\ln X]=-\frac{1}{\theta}.$$
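As a sanity check on b) (not part of the book's solution), a quick Monte Carlo sketch: since $F(x)=x^\theta$ on $(0,1)$, inverse-transform sampling gives $X=U^{1/\theta}$ with $U\sim\mathrm{Unif}(0,1)$, so averaging $\ln X$ over many draws should land near $-1/\theta$. The value $\theta=2$ is an arbitrary test choice.

```python
import random
import math

random.seed(0)

theta = 2.0          # arbitrary test value of the parameter
n_samples = 200_000

# Inverse-transform sampling: F(x) = x^theta on (0, 1),
# so X = U**(1/theta) with U uniform on (0, 1).
log_xs = [math.log(random.random() ** (1.0 / theta)) for _ in range(n_samples)]

mean_log = sum(log_xs) / n_samples
print(mean_log)      # should be close to -1/theta = -0.5
```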
These agree with the book's answers. Now to the real problem. Here is what I did for c):
$\hat{\theta}$ is unbiased iff $E[\hat{\theta}]=\theta.$ Let's check if it's true. We have that
$$E[\hat{\theta}]=E\left[\frac{-n}{\sum_{k=1}^n\ln{X_k}}\right]=\frac{-n}{\sum_{k=1}^nE[\ln{X_k}]}=\frac{-n}{-\frac{n}{\theta}}=\theta.\tag1$$
Yes, it is unbiased.
Correct answer: No, it is not unbiased. Take for example $\theta = 1$ and $n=1$; then $\hat{\theta}=-1/\ln{X}$ and
$$E[\hat{\theta}]=-E\left[\frac{1}{\ln{X}}\right]=\infty.$$
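To spell out the divergence (a step the book leaves implicit): for $\theta=1$ we have $X\sim\mathrm{Unif}(0,1)$, and near $x=1$ the expansion $-\ln x \approx 1-x$ gives

$$E\left[\frac{-1}{\ln X}\right]=\int_0^1\frac{-1}{\ln x}\,dx=\infty,$$

since the integrand behaves like $\frac{1}{1-x}$ as $x\to1^-$, and $\int^1\frac{dx}{1-x}$ diverges.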
Question:
What is wrong with $(1)?$
$E\left[\frac{1}{Y}\right]$ is generally not equal to $\frac{1}{E[Y]}$: expectation is linear, but it does not commute with taking reciprocals. The second equality in $(1)$ pushes the expectation into the denominator, which is exactly this invalid step applied to $Y=\sum_{k=1}^n\ln{X_k}$.
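The bias is easy to see numerically (my own sketch, not from the post): with $\theta=2$ and $n=5$, simulate many samples, compute $\hat{\theta}$ for each, and average. Since $-\ln X_k\sim\mathrm{Exp}(\theta)$, so that $-\sum_k\ln X_k\sim\Gamma(n,\text{rate}=\theta)$, the standard Gamma inverse-moment formula even predicts $E[\hat{\theta}]=\frac{n\theta}{n-1}=2.5$ here, not $2$.

```python
import random
import math

random.seed(1)

theta = 2.0      # arbitrary true parameter
n = 5            # small sample size makes the bias clearly visible
n_reps = 200_000

total = 0.0
for _ in range(n_reps):
    # One sample of size n from the pdf theta * x**(theta-1), via X = U**(1/theta).
    s = sum(math.log(random.random() ** (1.0 / theta)) for _ in range(n))
    total += -n / s          # the ML estimator for this sample

mean_est = total / n_reps
print(mean_est)  # noticeably above theta = 2; theory gives n*theta/(n-1) = 2.5
```

The average estimate sits well above the true $\theta$, confirming that $(1)$ cannot be right.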