Thus we have $X_1, \dots, X_n$ i.i.d. ${\rm Geom}(p)$ random variables with observed samples $x_1, \dots, x_n$ and pmf $P(X_i = k) = p(1-p)^{k}$ for $k = 0, 1, 2, \dots$
The maximum likelihood estimate I obtained is $\hat p = \frac{1}{\bar{x}+1}$.
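For reference, here is the standard computation that produces this estimator: maximize the log-likelihood in $p$.

$$\ell(p) = \sum_{i=1}^n \log\!\left(p(1-p)^{x_i}\right) = n\log p + \left(\sum_{i=1}^n x_i\right)\log(1-p),$$
$$\ell'(p) = \frac{n}{p} - \frac{\sum_{i=1}^n x_i}{1-p} = 0 \;\implies\; \hat p = \frac{n}{n + \sum_{i=1}^n x_i} = \frac{1}{\bar x + 1}.$$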
I need to show that this is an unbiased estimate of $p$, but I don't have any ideas.
I tried: $$E\left[\frac{1}{\bar x + 1}\right] = \frac{1}{E[\bar x]+ 1} = \frac{1}{x_1P(x_1)+\cdots+x_nP(x_n)}=\frac{1}{x_1p(1-p)^{x_1}+\cdots+x_np(1-p)^{x_n}}=???$$
Note that $E[\bar X] = E[X_i] = (1-p)/p$. The step $E\left[\frac{1}{\bar x + 1}\right] = \frac{1}{E[\bar x]+1}$ in your attempt is not valid: expectation does not commute with a nonlinear function. In fact, since $x \mapsto \frac{1}{x+1}$ is strictly convex on $(-1, \infty)$ and $\bar X$ is non-degenerate, Jensen's inequality is strict, so the MLE is biased upward:
$$E\left[{1 \over \bar X+1}\right]>{1\over E[\bar X]+1}={1\over \frac{1-p}{p}+1}=p.$$
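A quick Monte Carlo check makes the bias visible (a sketch; the sample size $n$, the value of $p$, and the trial count are arbitrary choices). Note that NumPy's `geometric` counts trials on $\{1, 2, \dots\}$, so we subtract 1 to match the number-of-failures parametrization $P(X=k)=p(1-p)^k$, $k \ge 0$, used here.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, trials = 0.3, 10, 200_000  # assumed illustrative values

# Draw `trials` samples of size n; shift support from {1,2,...} to {0,1,...}.
samples = rng.geometric(p, size=(trials, n)) - 1
xbar = samples.mean(axis=1)
mle = 1.0 / (xbar + 1.0)

print(f"true p      = {p}")
print(f"mean of MLE = {mle.mean():.4f}")  # strictly above p, as Jensen predicts
```

For $n = 10$ and $p = 0.3$ the average of $\hat p$ comes out a couple of percent above $p$; the bias shrinks as $n$ grows, consistent with the MLE being only asymptotically unbiased.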