Given a random sample $X_1, X_2, \ldots, X_n$ from the negative binomial distribution $Nb(N,p)$, I am instructed to find the UMVUE for the parameter $\frac{1-p}{p}$.
In general, the mean $\mu$ of the negative binomial is given by the formula $\frac{N(1-p)}{p}$, and the UMVUE for $\mu$ is the sample mean $\overline X$. So essentially I'm asked to find the UMVUE for the parameter $\frac{\mu}{N}$.
So, would the UMVUE for $\frac{\mu}{N}$ be $\frac{\overline X}{N}$? This seems intuitively correct to me, since it is indeed an unbiased estimator of $\frac{\mu}{N}$ and its variance, $\frac{1-p}{nNp^2}$, is small, but I can't seem to find a way to prove this (or get to any result at all) using the Lehmann–Scheffé theorem.
Any help would be greatly appreciated!
To apply the Lehmann–Scheffé theorem, we require a complete sufficient statistic $S(\boldsymbol{x})$ for $\theta=(1-p)/p$ and an unbiased estimator $T(S)$ of $\theta$. Starting from the pmf of our data: \begin{equation} P(X_i=x_i)=\binom{x_i+N-1}{x_i}(1-p)^{x_i} p^N. \end{equation} Consider two sample points $\boldsymbol{x}$ and $\boldsymbol{y}$. Since the ratio \begin{equation} \frac{P(\boldsymbol{x};p)}{P(\boldsymbol{y};p)}=\frac{\prod_{i=1}^{n}\binom{x_i+N-1}{x_i}(1-p)^{x_i} p^N}{\prod_{i=1}^{n} \binom{y_i+N-1}{y_i}(1-p)^{y_i} p^N}\propto (1-p)^{\sum_{i=1}^{n} x_i-\sum_{i=1}^{n} y_i} \end{equation} does not depend on $p$ if and only if $\sum_{i=1}^{n} x_i=\sum_{i=1}^{n} y_i$, we may conclude that $S(\boldsymbol{x})=\sum_{i=1}^{n} x_i$ is a minimal sufficient statistic for $p$ by a result of Lehmann and Scheffé ($1950$).
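As a quick numerical sanity check of this minimal-sufficiency argument (not part of the proof), the sketch below evaluates the likelihood ratio for two hypothetical samples with equal sums at several values of $p$; the values of $N$, the samples, and the grid of $p$'s are chosen arbitrarily for illustration. It uses `scipy.stats.nbinom`, whose parameterization (number of failures before the $N$-th success) matches the pmf above.

```python
import numpy as np
from scipy.stats import nbinom

N = 4  # assumed number of successes, chosen for illustration

def joint_pmf(sample, p):
    """Joint pmf of an i.i.d. Nb(N, p) sample."""
    return np.prod(nbinom.pmf(sample, N, p))

# Two samples with the same total: sum(x) = sum(y) = 9
x = np.array([2, 3, 4])
y = np.array([1, 3, 5])

# The likelihood ratio should be the same constant for every p,
# i.e. it carries no information about p when the sums agree.
ratios = [joint_pmf(x, p) / joint_pmf(y, p) for p in (0.2, 0.5, 0.8)]
print(ratios)
```

Here the ratio reduces to the ratio of the binomial-coefficient products, which is free of $p$ exactly as the displayed equation predicts.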
Noting that $X_i$ is in the exponential family, \begin{align} P(x_i;p)&=e^{\ln \binom{x_i+N-1}{x_i}+x_i\ln(1-p)+N\ln p},\\ &=e^{h(x_i)+A(p)t(x_i)+B(p)}, \end{align} with $h(x_i)=\ln \binom{x_i+N-1}{x_i}$, $A(p)=\ln(1-p)$, $t(x_i)=x_i$, and $B(p)=N\ln p$, we have that $\sum_{i=1}^{n} t(x_i)=\sum_{i=1}^{n} x_i=S(\boldsymbol{x})$ is a complete statistic, since the parameter space of $p$ contains an open set in $\mathbb{R}$ (this is a well-known result in most mathematical statistics textbooks).
The conditions used above continue to hold under any 'nice' (here, one-to-one) transformation of $p$, and $\theta=(1-p)/p$ is one-to-one in $p$ (we could have written our pmf in terms of $\theta$ and derived the same results). So $S(\boldsymbol{x})=\sum_{i=1}^{n} x_i$ is a complete sufficient statistic for $\theta$.
Now let's consider your claim that $T(S)=\bar{X}/N=S/nN$ is the UMVUE (one may also find such a $T$ from the Rao–Blackwell theorem). Since \begin{align} E\left[T(S)\right]&=E\left[\frac{\sum_{i=1}^{n}X_i}{nN}\right],\\ &=\frac{1}{nN}\sum_{i=1}^{n}E[X_i],\\ &=\frac{1}{nN}\frac{nN(1-p)}{p},\\ &=\frac{1-p}{p},\\ &=\theta, \end{align} we have that $T(S)$ is an unbiased estimator for $\theta$ and is a function of the complete sufficient statistic $S$ for $\theta$. Hence, by the Lehmann–Scheffé theorem, $T(S)$ is the UMVUE for $\theta$.
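The unbiasedness computation can also be checked by simulation. The Monte Carlo sketch below (with $N$, $p$, $n$, and the replication count chosen arbitrarily for illustration) draws many samples and confirms that the average of $T=\bar{X}/N$ lands near $\theta=(1-p)/p$. NumPy's `negative_binomial(N, p)` counts failures before the $N$-th success, matching the pmf above.

```python
import numpy as np

rng = np.random.default_rng(0)
N, p, n = 5, 0.4, 20        # assumed values for illustration
theta = (1 - p) / p         # target parameter; here 1.5

# 200,000 replications of a sample of size n; each row is one sample.
samples = rng.negative_binomial(N, p, size=(200_000, n))
T = samples.mean(axis=1) / N   # T = X̄ / N for each replication

print(T.mean())  # should be close to theta
```

Of course, simulation only demonstrates unbiasedness; the "uniformly minimum variance" part rests entirely on the completeness argument and Lehmann–Scheffé.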