How can I prove that this is an unbiased estimator?


Let $S$ be the sum of $n$ geometric random variables, each with parameter $p$. I have deduced that $S$ has a negative binomial distribution with parameters $n$ and $p$. How can I show that the estimator $\frac{n-1}{S-1}$ of $p$ is unbiased?

Thanks in advance.


We have $P(X_1=k)=p(1-p)^{k-1}$ for $k=1,2,\ldots$, and $S_n=X_1+\ldots+X_n$ has pmf
$$ \mathbb P(S_n=k)=\binom{k-1}{n-1}p^n(1-p)^{k-n}, \qquad k\geq n. $$
Then
$$ \mathbb E\left(\dfrac{n-1}{S_n-1}\right)=\sum_{k=n}^\infty\dfrac{n-1}{k-1}\binom{k-1}{n-1}p^n(1-p)^{k-n} = \sum_{k=n}^\infty\dfrac{n-1}{k-1}\cdot\dfrac{(k-1)!}{(n-1)!\,(k-n)!}\,p^n(1-p)^{k-n}. $$
Cancelling the factor $k-1$ against $(k-1)!$ and $n-1$ against $(n-1)!$ gives
$$ \mathbb E\left(\dfrac{n-1}{S_n-1}\right)=\sum_{k=n}^\infty\dfrac{(k-2)!}{(n-2)!\,(k-n)!}\,p^n(1-p)^{k-n}=p \sum_{k=n}^\infty\binom{k-2}{n-2}p^{n-1}(1-p)^{k-n}=p. $$
The last sum equals $1$ because it is the total probability of a negative binomial distribution with parameters $n-1$ and $p$: substituting $m=k-1$,
$$ \sum_{k=n}^\infty\binom{k-2}{n-2}p^{n-1}(1-p)^{k-n} = \sum_{m=n-1}^\infty\binom{m-1}{n-2}p^{n-1}(1-p)^{m-(n-1)}=\sum_{m=n-1}^\infty \mathbb P(S_{n-1}=m)=1. $$
Hence $\mathbb E\bigl[(n-1)/(S_n-1)\bigr]=p$, so the estimator is unbiased.
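As a quick sanity check on the algebra above, here is a small Monte Carlo sketch (function names are my own, not from any particular library): it draws $n$ geometric variables by inverse-CDF sampling, forms $S_n$, and averages $(n-1)/(S_n-1)$ over many trials. The empirical mean should sit very close to $p$.

```python
import math
import random

def geometric(rng, p):
    """Draw X ~ Geometric(p) with support 1, 2, ... via inverse-CDF sampling:
    P(X > k) = (1-p)^k, so X = floor(log U / log(1-p)) + 1 for U ~ Uniform(0,1)."""
    return int(math.log(rng.random()) / math.log(1.0 - p)) + 1

def mean_estimator(n, p, trials=100_000, seed=1):
    """Average the estimator (n-1)/(S_n - 1) over many simulated samples,
    where S_n is the sum of n i.i.d. Geometric(p) draws."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(trials):
        S = sum(geometric(rng, p) for _ in range(n))
        acc += (n - 1) / (S - 1)
    return acc / trials

# The empirical mean should be close to p, consistent with unbiasedness.
print(mean_estimator(5, 0.3))
```

Note that the naive plug-in estimator $n/S_n$ is biased upward (by Jensen's inequality applied to the convex map $s \mapsto n/s$), which is exactly why the $(n-1)/(S_n-1)$ correction is used.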