How to calculate the limit of the variance of the moment estimator of the geometric distribution?


Let $\overline X$ be the sample mean, from which a low-order moment estimator of the geometric distribution's parameter is obtained: $\;\hat{p}=1/\overline X$. I want to verify whether this estimator is unbiased and consistent for the parameter $p$. By calculation, the mathematical expectation of $\hat{p}$ is: $$ E(\hat{p})=np^n \sum_{k=n}^{\infty} \frac1k \binom{k-1}{n-1} (1-p)^{k-n} $$ One can show that $\lim\limits_{n\rightarrow\infty} E(\hat{p})=p$, which means that $\hat{p}$ is an asymptotically unbiased estimator of $p$. (See: How to calculate this limit related to hypergeometric functions.)
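This asymptotic unbiasedness is easy to check numerically. Below is a minimal Monte Carlo sketch (the values $p=0.3$, $n\in\{5,200\}$ and the use of NumPy are my own illustrative choices, not part of the derivation) showing that the average of $\hat p = 1/\overline X$ over many samples approaches $p$ as $n$ grows:

```python
# Monte Carlo check that p_hat = 1/X_bar is asymptotically unbiased.
# Illustrative parameters: p = 0.3, sample sizes n = 5 and n = 200.
import numpy as np

def mean_of_estimator(p, n, trials=20_000, seed=0):
    """Average of 1/X_bar over many samples of n geometric(p) variables."""
    rng = np.random.default_rng(seed)
    samples = rng.geometric(p, size=(trials, n))  # support {1, 2, ...}
    p_hat = 1.0 / samples.mean(axis=1)
    return p_hat.mean()

small_n = mean_of_estimator(0.3, 5)    # noticeable positive bias (Jensen)
large_n = mean_of_estimator(0.3, 200)  # bias is roughly p(1-p)/n, nearly gone
```

Since $x\mapsto 1/x$ is convex, Jensen's inequality makes the bias positive for every finite $n$; it shrinks like $p(1-p)/n$ by the delta method.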

Now I want to verify whether $\hat{p}$ is a consistent estimator of $p$; since it is asymptotically unbiased, it suffices to verify $\lim\limits_{n\rightarrow\infty} D(\hat{p})=0$. Considering $D(\hat{p})=E(\hat{p}^2)-E^2(\hat{p})$, we only need to calculate $\lim\limits_{n\rightarrow\infty} E(\hat{p}^2)$. We know that $$ E(\hat{p}^2)=n^2p^n\sum_{k=n}^{\infty} \frac{1}{k^2} \binom{k-1}{n-1} (1-p)^{k-n} $$ Note that $$\frac{1}{k^2}=\int_0^1 x^{k-1} (-\log x)\mathrm{d}x$$ Let $a=(1-p)/p$, $\,z=1-p$; then $$ \begin{aligned} \sum_{k=n}^{\infty} \binom{k-1}{n-1} \frac{z^k}{k^2} &=\int_0^1 \sum_{k=n}^{\infty} \binom{k-1}{n-1} x^{k-1} z^k (-\log x)\mathrm{d}x \\ &=\int_0^z \sum_{k=n}^{\infty} \binom{k-1}{n-1} t^{k-1} (\log z-\log t)\mathrm{d}t\quad(t=zx) \\ &=\int_0^z \frac{t^{n-1}}{(1-t)^n}(\log z-\log t) \mathrm{d}t \\ &=\int_0^{z/(1-z)} \frac{y^{n-1}}{1+y} \left(\log z-\log \frac{y}{1+y}\right) \mathrm{d}y \quad(t=\frac{y}{1+y}) \\ &=\int_0^a \frac{y^{n-1}}{1+y} \log \frac{z(1+y)}{y} \mathrm{d}y \end{aligned} $$ So we obtain $$ \lim_{n\rightarrow\infty} E(\hat{p}^2)=\lim_{n\rightarrow\infty}\frac{n^2}{a^n} \int_0^a \frac{y^{n-1}}{1+y} \log \frac{z(1+y)}{y} \mathrm{d}y $$ But I can't solve it... Thank you in advance for your help!
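As a sanity check on the series-to-integral identity above, the two sides can be compared numerically. The sketch below uses illustrative values $n=5$, $p=0.4$ (my own choice) and assumes SciPy is available for the quadrature:

```python
# Numerical check of the identity
#   sum_{k>=n} C(k-1, n-1) z^k / k^2  =  int_0^a y^(n-1)/(1+y) * log(z(1+y)/y) dy
# for illustrative values n = 5, p = 0.4 (so z = 0.6, a = z/(1-z) = 1.5).
import math
from scipy.integrate import quad

n, p = 5, 0.4
z = 1.0 - p
a = z / (1.0 - z)  # equals (1-p)/p

# Left side: truncated series (terms decay geometrically, so k < 2000 suffices).
series = sum(math.comb(k - 1, n - 1) * z**k / k**2 for k in range(n, 2000))

# Right side: the integral; the log blows up at y -> 0 but y^(n-1) tames it.
integral, _ = quad(lambda y: y**(n - 1) / (1 + y) * math.log(z * (1 + y) / y),
                   0, a)
```

On $(0,a)$ the argument $z(1+y)/y$ exceeds $1$ (it decreases to exactly $1$ at $y=a$), so the integrand is positive, matching the positive series.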

Best answer:

Substitute $y=ax^{1/n}$ (as I've suggested in a now-deleted comment). The limit becomes $$L=\lim_{n\to\infty}\int_0^1\left(n\log\frac{1+ax^{1/n}}{1+a}-\log x\right)\frac{\mathrm{d}x}{1+ax^{1/n}},$$ and we have dominated convergence here, allowing $\lim\leftrightarrow\int$. Since $$\lim_{n\to\infty}n\log\frac{1+ax^{1/n}}{1+a}=\lim_{n\to\infty}n\log\left(1-\frac{a}{1+a}(1-x^{1/n})\right)\\=-\frac{a}{1+a}\lim_{n\to\infty}n(1-x^{1/n})=\frac{a}{1+a}\log x,$$ we obtain $L=(1+a)^{-2}\int_0^1(-\log x)\,\mathrm{d}x=(1+a)^{-2}=p^2.$ Hence $\lim\limits_{n\to\infty}D(\hat{p})=L-p^2=0$, so $\hat{p}$ is indeed a consistent estimator of $p$.
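The value $L=p^2$ can also be confirmed numerically. The sketch below (illustrative $p=0.3$, SciPy assumed; my own choices) evaluates the substituted integral for increasing $n$ and watches it approach $p^2$:

```python
# Numerical check that the substituted integral tends to p^2 as n grows.
# Illustrative parameter: p = 0.3, so a = (1-p)/p = 7/3 and p^2 = 0.09.
import math
from scipy.integrate import quad

p = 0.3
a = (1.0 - p) / p

def In(n):
    """Integral after the substitution y = a * x**(1/n); should tend to p**2."""
    f = lambda x: (n * math.log((1 + a * x**(1 / n)) / (1 + a))
                   - math.log(x)) / (1 + a * x**(1 / n))
    val, _ = quad(f, 0, 1)  # -log x singularity at 0 is integrable
    return val

values = [In(10), In(100), In(1000)]  # successive approximations of p**2
```

The error decays like $O(1/n)$, consistent with the dominated-convergence argument above.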