Finding UMVUE for $p^t$ associated with a negative binomial distribution


Let $X$ be a random variable having the negative binomial distribution with $$ P(X=x)=\left(\begin{array}{c} x-1 \\ r-1 \end{array}\right) p^{r}(1-p)^{x-r}, \quad x=r, r+1, \ldots $$ where $p \in(0,1)$ and $r$ is a known positive integer.

Find the UMVUE of $p^{t},$ where $t$ is a positive integer and $t<r$.

I tried a direct approach to solve this problem but got stuck. I am wondering how I should approach this question using $E(T\mid S)$, where $T$ is an unbiased estimator and $S$ is a complete and minimal sufficient statistic.


Following what was suggested, I got $$g(j)=\frac{(j-t-1)!\,(r-1)!}{(r-t-1)!\,(j-1)!}, \quad j=r,r+1,\ldots$$
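As a quick numerical sanity check of this $g$ (an illustrative Python sketch, not part of the original question, using arbitrary values $r=5$, $t=2$, $p=0.4$), one can verify that $E[g(X)]=p^t$ by truncating the infinite series:

```python
from math import comb, isclose

# Check unbiasedness of g(X) for p^t, where
# g(j) = (j-t-1)!(r-1)! / ((r-t-1)!(j-1)!) = C(j-t-1, r-t-1) / C(j-1, r-1).
# The values of r, t, p below are arbitrary choices for illustration.
r, t, p = 5, 2, 0.4

def g(j):
    return comb(j - t - 1, r - t - 1) / comb(j - 1, r - 1)

# E[g(X)] = sum_{j>=r} g(j) C(j-1, r-1) p^r (1-p)^(j-r), truncated;
# the geometric tail beyond j = 500 is negligible.
expectation = sum(
    g(j) * comb(j - 1, r - 1) * p**r * (1 - p) ** (j - r)
    for j in range(r, 500)
)
print(expectation, p**t)  # both should be 0.16
assert isclose(expectation, p**t, rel_tol=1e-9)
```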


This question has been well addressed; here is a related question: Find UMVUE of var(X) and log(p) where p is the parameter for negative binomial distribution



On BEST ANSWER

Since $X$ is a complete sufficient statistic for $p$, all you need is an unbiased estimator of $p^t$ based on $X$. This estimator would be the UMVUE of $p^t$ by the Lehmann–Scheffé theorem.

So take any function $g(X)$ which is unbiased for $p^t$ for every $p\in (0,1)$ and solve for $g$.

You have

$$ E\left[g(X)\right]=\sum_{j=r}^\infty g(j) \binom{j-1}{r-1}p^r(1-p)^{j-r} =p^t\quad,\forall\,p\in (0,1)$$

Taking $q=1-p$, this implies

$$ \sum_{j=r}^\infty g(j) \binom{j-1}{r-1}q^j =\frac{q^r}{(1-q)^{r-t}} =\sum_{k=r-t}^\infty \binom{k-1}{r-t-1}q^{k+t} \quad,\forall\,q \in(0,1) \tag{$\star$} $$

The infinite series expansion in the last step follows from the fact that $$\sum_{k}P(X=k)=1\implies \sum_{k=r}^\infty \binom{k-1}{r-1}q^k=\left(\frac{q}{1-q}\right)^r$$

This can also be shown as a separate identity.
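As a quick numerical illustration of this identity (a Python sketch with arbitrary values $r=3$, $q=0.3$, not part of the derivation):

```python
from math import comb, isclose

# Verify sum_{k>=r} C(k-1, r-1) q^k = (q / (1-q))^r for 0 < q < 1.
# r and q below are arbitrary illustrative choices.
r, q = 3, 0.3
series = sum(comb(k - 1, r - 1) * q**k for k in range(r, 400))  # truncated tail
closed_form = (q / (1 - q)) ** r
print(series, closed_form)  # both approximately 0.0787...
assert isclose(series, closed_form, rel_tol=1e-9)
```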

Finally compare coefficients of $q^j$ from both sides of $(\star)$ to find $g(\cdot)$.
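Carrying out the comparison explicitly: the coefficient of $q^j$ on the right-hand side of $(\star)$ corresponds to $k=j-t$, so for $j=r,r+1,\ldots$

$$g(j)\binom{j-1}{r-1}=\binom{j-t-1}{r-t-1}\implies g(j)=\frac{\binom{j-t-1}{r-t-1}}{\binom{j-1}{r-1}}=\frac{(j-t-1)!\,(r-1)!}{(r-t-1)!\,(j-1)!},$$

which agrees with the expression obtained in the question.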


Yes, I agree with using Rao–Blackwellization. This is what I would do:

First, observe that your negative binomial random variable is the sum of $r$ i.i.d. geometric random variables; that is, the statistical model is a geometric model replicated $r$ times:

$$P[Y=y]=p(1-p)^{y-1}, \quad y=1,2,3,\ldots$$

Now I would set

$T=$ unbiased estimator for $p^t$:

$$T=\mathbb{1}_{\{1\}}(Y_1)\cdot \mathbb{1}_{\{1\}}(Y_2)\cdot \ldots \cdot \mathbb{1}_{\{1\}}(Y_t),$$

so that $P(T=1)=p^t$ and $P(T=0)=1-p^t$, and hence $\mathbb{E}[T]=p^t$.

$S=$ Sufficient and Complete statistic for the model

$$S=\sum_{i=1}^{r} Y_i$$


I found

$$\mathbb{E}[T\mid S=s]=P(T=1\mid S=s)=\frac{p^t\binom{s-t-1}{r-t-1}p^{r-t}(1-p)^{s-r}}{\binom{s-1}{r-1}p^{r}(1-p)^{s-r}}=\frac{(s-t-1)!\,(r-1)!}{(s-1)!\,(r-t-1)!}, \quad s=r,r+1,\ldots$$
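This conditional probability can be checked by brute-force enumeration (an illustrative Python sketch with arbitrary values $r=4$, $t=2$, $s=9$): given $S=s$, every composition of $s$ into $r$ positive parts is equally likely, since the factor $p^r(1-p)^{s-r}$ is the same for each, so $P(T=1\mid S=s)$ is a ratio of composition counts.

```python
from itertools import product
from math import comb

# Given S = s, all compositions (y_1, ..., y_r) of s into positive parts
# are equally likely. P(T=1 | S=s) is the fraction with y_1 = ... = y_t = 1.
# r, t, s below are arbitrary illustrative choices.
r, t, s = 4, 2, 9
total = 0       # number of compositions of s into r positive parts
favourable = 0  # those with the first t parts all equal to 1
for ys in product(range(1, s + 1), repeat=r):
    if sum(ys) != s:
        continue
    total += 1
    if all(y == 1 for y in ys[:t]):
        favourable += 1
assert total == comb(s - 1, r - 1)              # 56 compositions
assert favourable == comb(s - t - 1, r - t - 1)  # 6 favourable ones
print(favourable / total)  # equals (s-t-1)!(r-1)! / ((s-1)!(r-t-1)!)
```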