Let $X_1,X_2,\dots,X_n$ be a sample from a distribution that has the CDF
$$F(x)=1-\frac{1}{(x+1)^{\mu}}, \quad x>0,$$
where $\mu > 0$ is an unknown parameter. Find the ML estimator $\hat{\mu}$ of $\mu$ and determine whether this estimator is unbiased.
Solution: Differentiating the CDF yields the PDF
$$f(x)=\frac{\mu}{(x+1)^{\mu+1}}. \tag1$$
The likelihood function is then
$$L(\mu)=\frac{\mu^n}{\prod_i(x_i+1)^{\mu+1}}. \tag 2 $$
Taking the logarithm gives
$$l(\mu)=n\ln(\mu)-(\mu+1)\sum_i\ln{(x_i+1)}.\tag3$$
Differentiating gives
$$l'(\mu)=\frac{n}{\mu}-\sum_i\ln(x_i+1)\Rightarrow \hat{\mu}=\frac{n}{\sum_i\ln(x_i+1)}.\tag4$$
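As a sanity check (my addition, not part of the original solution), the closed form in $(4)$ can be verified numerically: inverting the CDF gives $X = U^{-1/\mu} - 1$ for $U\sim\mathrm{Unif}(0,1)$, so we can simulate a large sample and compare $\hat\mu$ with the true $\mu$. A minimal sketch in Python, assuming NumPy is available (the parameter values are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_true = 2.0    # true parameter, chosen arbitrarily for the check
n = 100_000      # large sample, so the MLE should land close to mu_true

# Inverse-CDF sampling: F(x) = 1 - (x+1)^(-mu)  =>  X = U^(-1/mu) - 1
u = rng.uniform(size=n)
x = u ** (-1.0 / mu_true) - 1.0

# ML estimate from equation (4): mu_hat = n / sum(ln(x_i + 1))
mu_hat = n / np.log(x + 1.0).sum()
print(mu_hat)    # close to 2.0 for a sample this large
```

For $n$ this large the estimate agrees with $\mu$ to about two decimal places, which is consistent with $(4)$ being the correct maximizer.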
The estimator is not unbiased. To see this, we can set $n=1$ and assume $\mu=1.$ We then get
$$E[\hat{\mu}]=\int_0^{\infty}P(X>x) \, dx = \int_0^{\infty}\frac{1}{x+1}\,dx=\infty \neq \mu.\tag 5$$
Questions:
- I don't understand $(5)$. To me it seems that $$P(X>x)=1-P(X<x)=1-F(x)=\frac{1}{1+x},$$ with $\mu=1$. But where does $n=1$ play a role here?
- Should I not be using $\hat{\mu}$ somehow to compute $E[\hat{\mu}]$?
Formula $(5)$ is indeed mistaken: by the tail-sum formula, $\int_0^\infty P(X>x)\,dx = \mathsf E(X)$, which is not the desired $\mathsf E(\hat\mu)$. Fortunately, the conclusion holds regardless. The assumption $n=1$ enters through the estimator itself: with a single observation, $(4)$ reduces to $\hat\mu = 1/\ln(X+1)$. With $\mu=1$ the density is $f(x)=(x+1)^{-2}$, so
\begin{align}
\mathsf E(\hat\mu)&=\mathsf E\left[\frac1{\ln(X+1)}\right]\\
&=\int_0^\infty\frac{dx}{(x+1)^2\ln(x+1)}\\
&=\int_0^\infty\frac{e^{-u}}{u}\,du=\infty,
\end{align}
where the last step substitutes $u=\ln(x+1)$ (so $dx=e^u\,du$), and the integral diverges at the lower limit since $e^{-u}/u\sim 1/u$ as $u\to0^+$. Hence $\mathsf E(\hat\mu)=\infty\neq 1=\mu$, and $\hat\mu$ is not unbiased.
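The divergence can also be seen numerically (my addition, not from the thread). Since $P(\ln(X+1)>t)=1-F(e^t-1)=e^{-\mu t}$, we have $\ln(X+1)\sim\mathrm{Exp}(\mu)$, so with $n=1,\mu=1$ the expectation is $\int_0^\infty e^{-u}/u\,du$. The truncated integral $\int_\varepsilon^\infty e^{-u}/u\,du$ grows like $\ln(1/\varepsilon)$ as $\varepsilon\to0$, which a crude midpoint rule makes visible. A sketch in plain Python (the helper name and grid parameters are mine):

```python
import math

def truncated_integral(eps, upper=50.0, steps=100_000):
    """Midpoint-rule estimate of the integral of e^(-u)/u from eps to upper.

    Substituting v = ln(u) turns it into the integral of e^(-e^v) dv,
    whose integrand is bounded in [0, 1], so a uniform grid in v works.
    (The tail beyond upper=50 is negligible.)
    """
    a, b = math.log(eps), math.log(upper)
    h = (b - a) / steps
    return h * sum(math.exp(-math.exp(a + (k + 0.5) * h)) for k in range(steps))

# Each 100-fold shrink of eps adds roughly ln(100) ~ 4.6 to the value:
# the truncated integral grows without bound, so E[mu_hat] = infinity.
for eps in (1e-2, 1e-4, 1e-6, 1e-8):
    print(eps, truncated_integral(eps))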