Originally, I was attempting to find a Method of Moments estimator for $\lambda$ given $X_i \sim Geom(e^{-\lambda})$ s.t. $f_\lambda(x) = e^{-\lambda}(1 - e^{-\lambda})^x$.
I found it to be $\hat{\lambda} = \log\left(1 + \frac{\sum_{i=1}^n X_i}{n}\right)$. I am now attempting to verify if it is unbiased ($E\left[\hat{\lambda}\right] \stackrel{?}{=} \lambda$). This leads to a super nasty summation:
$$ E\left[\log\left(1 + \frac{\sum_{i=1}^n X_i}{n}\right)\right] = \sum_{k=0}^\infty \log\left(1 + \frac{k}{n}\right)\cdot \underbrace{\binom{k + n - 1}{k} (1 - e^{-\lambda})^k e^{-n\lambda}}_\text{Negative binomial pmf} $$
Note that $\sum X_i \sim NB(n, e^{-\lambda})$.
My thoughts: I suspect the sum is either divergent or, at the very least, not equal to $\lambda$.
- Are there any tools I could use to indirectly state this?
- Is there a feasible pen/paper way to calculate the exact sum or should this be left for something like Mathematica?
Thank you to @StubbornAtom for suggesting Jensen's inequality. Since $\log$ is concave, Jensen's inequality gives $E[g(X)] \leq g(E[X])$ for concave $g$:
$$ \begin{aligned} E\left[\log\left(1 + \dfrac{\sum_{i=1}^n X_i}{n}\right)\right] &\leq \log\left(E\left[1 + \dfrac{\sum_{i=1}^n X_i}{n}\right]\right) \quad \text{(Jensen's inequality)} \\ &=\log\left(1 + \dfrac{E\left[\sum_{i=1}^n X_i\right]}{n}\right) \quad \left(\text{Note: } \sum X_i \sim NB\left(n, e^{-\lambda}\right)\right) \\ &=\log\left(1 + \dfrac{n\left(1 - e^{-\lambda}\right)/e^{-\lambda}}{n}\right) \\ &= \log\left(1 + \dfrac{1}{e^{-\lambda}} - 1\right) \\ &= \log\left(e^{\lambda}\right) = \lambda \end{aligned} $$
This shows that $E[\hat{\lambda}] \leq \lambda$, i.e. the method of moments estimator is at most $\lambda$ in expectation. Equality in Jensen's inequality would require the transformation to be affine (or $\sum X_i$ to be degenerate); since $\log$ is strictly concave and $\sum X_i$ is non-degenerate, the inequality is strict.
Thus $E[\hat{\lambda}] < \lambda$, and $\hat{\lambda}$ is a biased (downward) estimator of $\lambda$.
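As a numerical sanity check (this is my own addition, not part of the derivation above), the expectation can be evaluated directly by truncating the negative binomial sum; the function name and truncation level `kmax` are arbitrary choices. The geometric factor $(1-e^{-\lambda})^k$ makes the tail negligible, so a modest truncation suffices:

```python
from math import exp, lgamma, log

def expected_lambda_hat(lam, n, kmax=2000):
    """Truncated evaluation of E[log(1 + S/n)] where S ~ NB(n, e^{-lam}),
    i.e. the expectation of the method of moments estimator."""
    p = exp(-lam)
    total = 0.0
    for k in range(kmax + 1):
        # log of the NB pmf: C(k + n - 1, k) * (1 - p)^k * p^n,
        # computed via lgamma for numerical stability
        log_pmf = (lgamma(k + n) - lgamma(k + 1) - lgamma(n)
                   + k * log(1 - p) + n * log(p))
        total += log(1 + k / n) * exp(log_pmf)
    return total

lam, n = 1.0, 10
est = expected_lambda_hat(lam, n)
print(est, "<", lam)  # the expectation comes out strictly below lambda
```

For $\lambda = 1$ and $n = 10$ the truncated expectation lands a few percent below $\lambda$, consistent with the strict Jensen inequality, and a second-order Taylor expansion of $\log(1 + x/n)$ around $E[\sum X_i]$ suggests the bias shrinks as $n$ grows.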