UMVUE of $\theta$ for Negative Binomial family


Consider the negative binomial distribution, with pmf: $$P(X=x|\theta,k) = \frac{\Gamma(x+k)}{x! \Gamma (k)} \left(\frac{k}{k+\theta}\right)^k \left(\frac{\theta}{k+\theta}\right)^x, \ x=0,1,..$$

Let $X_1,...,X_n$ be a random sample from this distribution. Find the UMVUE of $\theta$.

Here, we know that $\sum_{i=1}^{n} X_i$ is a complete sufficient statistic for $\theta$ (with $k$ known, the pmf forms an exponential family). So, all we need to find is an unbiased estimator which is a function of $\sum X_i$.

Using the fact that $T=\sum X_i \sim \text{Negative Binomial}(nk, \frac{k}{k+\theta})$, we need to find $g(T)$ such that $E(g(T))=\theta$.
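As a quick numerical sanity check of this fact (a sketch, with arbitrary example values $\theta=2$, $k=3$, $n=4$): in `scipy.stats`, `nbinom(k, q)` with $q = \frac{k}{k+\theta}$ has exactly the pmf in the question, so convolving $n$ copies of the single-observation pmf should reproduce the $\mathrm{NB}(nk, q)$ pmf.

```python
import numpy as np
from scipy.stats import nbinom

theta, k, n = 2.0, 3, 4          # hypothetical example values
q = k / (k + theta)              # "failure" probability k/(k+theta)

# pmf of a single X_i on a truncated support
xs = np.arange(200)
pmf_single = nbinom.pmf(xs, k, q)

# convolve n copies to get the pmf of T = X_1 + ... + X_n
pmf_T = pmf_single
for _ in range(n - 1):
    pmf_T = np.convolve(pmf_T, pmf_single)

# compare with the claimed NB(nk, q) pmf on the same support
pmf_claim = nbinom.pmf(np.arange(len(pmf_T)), n * k, q)
print(np.max(np.abs(pmf_T[:200] - pmf_claim[:200])))  # should be near machine precision
```

(The first 200 entries of the convolution are exact, since no truncated tail terms contribute there.)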

Now, $E(g(T))=\sum_{t=0}^{\infty} g(t) {t+nk-1 \choose nk-1}\left(1-\frac{k}{k+\theta} \right)^t \left(\frac{k}{k+\theta}\right)^{nk}=\theta$. Simplifying, we get:

$$\sum_{t=0}^{\infty} g(t){t+nk-1 \choose nk-1}\theta^t (k+\theta)^{-t} = \frac{1}{k^{nk}}\left[\theta(k+\theta)^{nk}\right]$$

My idea is to expand the negative power on the LHS and then compare the coefficients of $\theta^j$ on both sides. However, things get complicated after this. Is there any easy way out?

BEST ANSWER

With your $(\theta, k)$-parameterisation, it seems that $X_i$ counts the number of successes $\in \{0, 1, 2, \dots \}$ until $k$ failures, where each success has probability $p = \frac{\theta}{k + \theta}$ and each failure has probability $1 - p = \frac{k}{k + \theta}$. Thus, using a parameterisation in terms of $(k, p)$, we have $X_i \sim \mathrm{NB}(k, p)$ with $p = \frac{\theta}{k + \theta}$.

Note that the mean of $X_i$ is $\frac{pk}{1-p} = \theta$. Since you identified that $T = \sum_{i=1}^n X_i \sim \mathrm{NB}(nk, p)$, we know that $\mathbb{E}[T] = \frac{pnk}{1-p} = n\theta$ (which also follows from linearity of expectation), and so $\mathbb{E}\left[\frac{T}{n}\right] = \theta$, where $\frac{T}{n} = \overline{X}$ is nothing but the sample mean. Since $\overline{X}$ is an unbiased estimator that is a function of the complete sufficient statistic $T$, the Lehmann–Scheffé theorem gives that $\overline{X}$ is the UMVUE of $\theta$.
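To illustrate the unbiasedness numerically (a sketch with arbitrary example values $\theta=2$, $k=3$, $n=10$): NumPy's `negative_binomial(r, p)` counts failures before $r$ successes with success probability $p$, so relabelling success/failure maps it onto the question's parameterisation with $p = \frac{k}{k+\theta}$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, k, n = 2.0, 3, 10         # hypothetical example values
p_np = k / (k + theta)           # numpy's "success" prob = the question's failure prob

# Each row is one sample X_1, ..., X_n; 100_000 Monte Carlo replications
samples = rng.negative_binomial(k, p_np, size=(100_000, n))

xbar = samples.mean(axis=1)      # the UMVUE, computed on each replication
print(xbar.mean())               # should be close to theta = 2.0
```

The Monte Carlo average of $\overline{X}$ should land within a few standard errors of $\theta$.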