Minimum Variance Estimators that Attain the Cramér-Rao lower bound


The probability mass function of a negative binomial random variable $X$ is given by $$P(X=x\mid\theta)= {r+x-1 \choose x}(1-\theta)^x\theta^r$$ and $\Bbb E(X\mid \theta)=\frac{r(1-\theta)}{\theta}$.

Does this model admit a minimum variance unbiased estimator that attains the Cramér-Rao lower bound?

What I have tried:

I have worked out the maximum likelihood estimator $\hat{\theta}$ to be $\frac{r}{r+x}$ and the CRLB to be $\Bbb V(\hat{\theta}) \geq \frac{1}{I(\theta)}= \frac{\theta^2-\theta^3}{r}$.
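As a sanity check on this calculation, the Fisher information for one observation can be derived symbolically (a sketch using sympy; the binomial coefficient is dropped since it does not depend on $\theta$, and $r$ is treated as a fixed constant):

```python
# Symbolic derivation of the Fisher information for a single
# negative binomial observation.
import sympy as sp

theta, x, r = sp.symbols("theta x r", positive=True)

# log-likelihood of one observation, omitting the binomial
# coefficient (it does not involve theta)
loglik = x * sp.log(1 - theta) + r * sp.log(theta)

score = sp.diff(loglik, theta)
# I(theta) = -E[d^2/dtheta^2 log f]; substitute E[X] = r(1-theta)/theta
info = sp.simplify(-sp.diff(score, theta).subs(x, r * (1 - theta) / theta))
print(info)  # simplifies to r/(theta**2*(1-theta)), so CRLB = (theta**2 - theta**3)/r
```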

I know that if an unbiased estimator attains the CRLB, then we have found a Minimum Variance Unbiased Estimator (MVUE). However, I am unsure how to calculate the variance of $\hat{\theta}$.

Thank you for your help!

Best answer:

I know that if an unbiased estimator attains the CRLB, then we have found a Minimum Variance Unbiased Estimator.

A UMVUE (Uniformly Minimum Variance Unbiased Estimator) rarely attains the CRLB.

Moreover, if the family of distributions belongs to the exponential family (as is the case here), there always exists an estimator that attains the CRLB, though possibly only for a particular function of $\theta$.

To verify whether an estimator attains the lower bound, there is a necessary and sufficient condition:

An estimator $T$ attains the CRLB if and only if there exists a function $b(n,\theta)$ such that

$$\sum_{i=1}^n\frac{\partial}{\partial\theta}\log f(x_i;\theta)=b(n,\theta)[T-\mathbb{E}[T]]$$

In your case you have

$$\sum_{i=1}^n\frac{\partial}{\partial\theta}\log f(x_i;\theta)=\frac{nr}{\theta}-\frac{\Sigma_i X_i}{1-\theta}=\underbrace{-\frac{n}{(1-\theta)}}_{=b(n,\theta)}\Bigg[\overline{X}_n-\frac{r(1-\theta)}{\theta}\Bigg]$$
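This factorization can be double-checked symbolically (a small sympy sketch, writing $\sum_i x_i = n\bar{x}$):

```python
# Verify that the score sum equals b(n,theta) * (Xbar - E[X])
# with b(n,theta) = -n/(1-theta).
import sympy as sp

theta, n, r, xbar = sp.symbols("theta n r xbar", positive=True)

# sum of the per-observation scores, with sum_i x_i = n*xbar
score = n * r / theta - n * xbar / (1 - theta)
factored = -n / (1 - theta) * (xbar - r * (1 - theta) / theta)

print(sp.simplify(score - factored))  # 0
```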

Thus $\overline{X}_n$ is the UMVUE of $\mathbb{E}[X]=\frac{r(1-\theta)}{\theta}$ and attains the CRLB. Note that the factorization singles out $\overline{X}_n$: it is this estimator of $\mathbb{E}[X]$, not an unbiased estimator of $\theta$ itself, that achieves the bound.
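One can also see this numerically. The following Monte Carlo sketch (the values of $r$, $\theta$, and $n$ are illustrative choices, not from the post) compares the variance of $\overline{X}_n$ with the CRLB for $g(\theta)=\frac{r(1-\theta)}{\theta}$, which equals $[g'(\theta)]^2/(n\,I(\theta)) = \frac{r(1-\theta)}{n\theta^2}$:

```python
# Monte Carlo check that Var(Xbar) matches the CRLB for estimating
# E[X] = r(1-theta)/theta. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
r, theta, n, reps = 5, 0.4, 50, 20000

# Generator.negative_binomial counts failures before the r-th
# success, matching the pmf above
samples = rng.negative_binomial(r, theta, size=(reps, n))
var_xbar = samples.mean(axis=1).var()

# CRLB for g(theta) = r(1-theta)/theta
crlb = r * (1 - theta) / (n * theta**2)

print(var_xbar, crlb)  # agree up to simulation error
```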


Reference: Mood, Graybill and Boes, *Introduction to the Theory of Statistics*, McGraw-Hill.
