Let $X$ be negative binomially distributed with parameters $n \in \mathbb{N}$ and $p \in\, ]0,1[$, i.e. $$ \mathbb{P}(X=k)=\binom{n+k-1}{k} p^n(1-p)^k, \quad k \in \mathbb{N}_0. $$ Let $n \in \mathbb{N}$ be known and $p \in\, ]0,1[$ be unknown. The maximum likelihood estimator for $p$ is given by $$ \widehat{p}(k)=\frac{n}{k+n}. $$
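As a quick numerical sanity check of the stated MLE, one can evaluate the log-likelihood of a single observation $k$ on a fine grid of $p$ values and confirm that the maximizer sits at $n/(k+n)$. This is just an illustrative sketch; the values `n = 5, k = 12` are arbitrary choices, not from the problem.

```python
from math import comb, log

def log_lik(p, n, k):
    # log of the negative binomial pmf P(X = k) with parameters n, p
    return log(comb(n + k - 1, k)) + n * log(p) + k * log(1 - p)

n, k = 5, 12            # arbitrary illustrative values
p_hat = n / (k + n)     # claimed maximum likelihood estimate

# evaluate the log-likelihood on a fine grid over ]0,1[ and locate its maximizer
grid = [i / 10000 for i in range(1, 10000)]
p_best = max(grid, key=lambda p: log_lik(p, n, k))

print(p_hat, p_best)    # grid maximizer should be close to n/(k+n)
```

Up to the grid resolution, the numerical maximizer agrees with the closed-form estimate.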
How can I show that $1 / \widehat{p}$ is unbiased for $\frac{1}{p}$, where $\widehat{p}$ denotes the maximum likelihood estimator from part (a)?
The inverse of the estimator is indeed unbiased for $1/p$. Write $X=\sum_{i=1}^n X_i$, where $X_1,\dots,X_n$ are i.i.d. geometrically distributed with parameter $p$, each counting the failures before the first success, so that $\mathbb{E}[X_i]=\frac{1-p}{p}$; this sum is negative binomially distributed with parameters $n$ and $p$. Then $$ \begin{aligned} \mathbb{E}\left[\frac{1}{\widehat{p}(X)}\right]=\mathbb{E}\left[\frac{X+n}{n}\right]&=\mathbb{E}\left[1+\frac{1}{n} \sum_{i=1}^n X_i\right]=1+\frac{1}{n} \sum_{i=1}^n \mathbb{E}\left[X_i\right]\\ &=1+\frac{1}{n}\, n\, \mathbb{E}\left[X_1\right]=1+\frac{1-p}{p}=\frac{1}{p}. \end{aligned} $$ Thanks for the help, the rest is clear.
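The unbiasedness can also be checked by simulation: draw many negative binomial samples (here generated as failures before the $n$-th success in a Bernoulli sequence), average $1/\widehat{p}(X)=(X+n)/n$, and compare against $1/p$. A minimal sketch, with `n = 4, p = 0.3` as arbitrary illustrative values:

```python
import random

def neg_binomial(n, p, rng):
    # draw X ~ NegBin(n, p): total failures before the n-th success
    failures, successes = 0, 0
    while successes < n:
        if rng.random() < p:
            successes += 1
        else:
            failures += 1
    return failures

rng = random.Random(0)
n, p = 4, 0.3
N = 200_000  # Monte Carlo sample size (illustrative choice)

# empirical mean of 1 / p_hat(X) = (X + n) / n over many draws
est = sum((neg_binomial(n, p, rng) + n) / n for _ in range(N)) / N
print(est, 1 / p)  # the two should agree up to Monte Carlo error
```

The empirical average matches $1/p$ up to the usual $O(1/\sqrt{N})$ Monte Carlo fluctuation, consistent with the algebraic argument above.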