Estimation of a geometric distribution's parameter by the reciprocal of the sample mean


I came across this exercise while studying statistics, but I can't work out the solution. The exercise asks to show whether the reciprocal of the sample mean is an unbiased estimator of the unknown parameter $p$ when the sample is drawn from a $Geo(p)$ distribution.

I've tried the following: Let $\underline{x}$ be our sample of size $n$, and let our estimator be $T(\underline{x}) = \frac{n}{\sum_{i=1}^{n} x_i}$. We need to check whether $E(T(\underline{x})) = p$.

The sum of $n$ i.i.d. $Geo(p)$ random variables follows a $NegBinom(n, p)$ distribution, so, if I'm not mistaken, the expectation of our estimator is:

$$E(T(\underline{x})) = n\sum_{i=n}^{\infty}\frac{1}{i}\binom{i}{n}(1-p)^{i-n}p^n = \sum_{i=n}^{\infty}\binom{i-1}{n-1}p^n(1-p)^{i-n}$$

Since the expectation has to equal $p$, I set the two sides equal: $\sum_{i=n}^{\infty}\binom{i-1}{n-1}p^n(1-p)^{i-n} = p$. From this, $\sum_{i=n}^{\infty}\binom{i-1}{n-1}p^{n-1}(1-p)^{i-n} = 1$ would have to be shown (for $p \neq 0$). It is obviously reminiscent of a binomial series, but I can't progress from this point. Any help would be appreciated.
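Edit: a quick numerical check (a Python sketch, with $n=5$ and $p=0.3$ chosen arbitrarily for illustration) suggests the truncated sum is $1/p$ rather than $1$, so something must be off in my approach:

```python
from math import comb

def partial_sum(n, p, terms=500):
    """Truncate S = sum_{i>=n} C(i-1, n-1) p^(n-1) (1-p)^(i-n).

    The geometric factor (1-p)^(i-n) makes the tail negligible
    after a few hundred terms.
    """
    return sum(comb(i - 1, n - 1) * p ** (n - 1) * (1 - p) ** (i - n)
               for i in range(n, n + terms))

print(partial_sum(5, 0.3))  # ≈ 3.3333... = 1/p, not 1
```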

By Jensen's inequality (the map $x \mapsto 1/x$ is strictly convex on $(0,\infty)$, and $\sum_{i=1}^{n} x_i$ is non-degenerate), $$\mathbb E\left(\frac{n}{\sum_{i=1}^{n} x_i}\right)>\frac{n}{\mathbb E\left(\sum_{i=1}^{n} x_i\right)}=p,$$ so this estimator cannot be unbiased.

The exact value is $$ \mathbb E(T(\underline{x})) = n\sum_{i=n}^{\infty}\frac{1}{i}\binom{i-1}{n-1}(1-p)^{i-n}p^n, $$ which leads to a hypergeometric function and has no simple closed form.
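This can be checked numerically; a small Python sketch (with illustrative values $n=5$, $p=0.3$) truncates the series and confirms the expectation exceeds $p$:

```python
from math import comb

def expected_reciprocal_mean(n, p, terms=500):
    """Truncate E[T] = n * sum_{i>=n} (1/i) C(i-1, n-1) (1-p)^(i-n) p^n.

    The geometric factor (1-p)^(i-n) makes the tail negligible.
    """
    return n * sum(comb(i - 1, n - 1) * (1 - p) ** (i - n) * p ** n / i
                   for i in range(n, n + terms))

print(expected_reciprocal_mean(5, 0.3))  # strictly greater than p = 0.3
```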

On the other hand, if you consider $\frac{n}{\sum_{i=1}^n x_i -1}$ instead of $T(\underline x)$ (for $n \ge 2$), then the expected value can be computed: $$ \mathbb E\left(\frac{n}{\sum_{i=1}^n x_i -1}\right) = n\sum_{i=n}^{\infty}\frac{1}{i-1}\binom{i-1}{n-1}(1-p)^{i-n}p^n $$ $$= \frac{n}{n-1}p \sum_{i=n}^\infty \binom{i-2}{n-2}p^{n-1}(1-p)^{i-n} = \frac{n}{n-1}p, $$ since the last sum equals $1$, being the sum of the probabilities of all possible values of a $NegBinom(n-1,p)$ distribution: $$ 1=\sum_{j=n-1}^\infty \binom{j-1}{n-2}p^{n-1}(1-p)^{j-n+1}=\bigl[i=j+1\bigr]=\sum_{i=n}^\infty \binom{i-2}{n-2}p^{n-1}(1-p)^{i-n}. $$ And from $\mathbb E\left(\frac{n}{{\sum_{i=1}^n x_i} -1}\right) = \frac{n}{n-1}p$ you can build an unbiased estimator if needed, namely $\frac{n-1}{\sum_{i=1}^n x_i -1}$.
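As a sanity check, a short Python sketch (again with illustrative values $n=5$, $p=0.3$) evaluates the truncated series for the shifted estimator and compares it with $\frac{n}{n-1}p$:

```python
from math import comb

def expected_shifted(n, p, terms=500):
    """Truncate E[n / (sum x_i - 1)]
       = n * sum_{i>=n} C(i-1, n-1) (1-p)^(i-n) p^n / (i - 1).

    Requires n >= 2 so the i = n term has nonzero denominator n - 1.
    """
    return n * sum(comb(i - 1, n - 1) * (1 - p) ** (i - n) * p ** n / (i - 1)
                   for i in range(n, n + terms))

n, p = 5, 0.3
print(expected_shifted(n, p), n / (n - 1) * p)  # both ≈ 0.375
```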