Question:
Let $X_1,\dots,X_n$ be an i.i.d. random sample from the distribution with PDF $f(x;\theta)=\theta(1+\theta)x^{\theta-1}(1-x)$, $0<x<1$, $\theta>0$.
a) Find the method of moments estimator for $\theta$ based on the first moment of $X_i$.
b) Show the estimator obtained in part (a) is consistent. Derive its asymptotic distribution.
Confusion:
It's easy to calculate that the answer for part (a) should be $\hat{\theta}=\frac{2\overline{X}_n}{1-\overline{X}_n}$; the difficulty is part (b). To prove consistency, we must show $\frac{2\overline{X}_n}{1-\overline{X}_n}\xrightarrow{p}\theta$. Since (as far as I know) only $L_p$-convergence and almost-sure convergence imply convergence in probability, I've tried both of these approaches, but got nowhere. What's the correct method to solve this problem?
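(For completeness, the first-moment calculation behind part (a): integrating against the PDF,
$$ \mathbb{E}[X]=\int_0^1 x\,\theta(1+\theta)x^{\theta-1}(1-x)\,dx=\theta(1+\theta)\left(\frac{1}{\theta+1}-\frac{1}{\theta+2}\right)=\frac{\theta}{\theta+2}, $$
and setting $\overline{X}_n=\hat{\theta}/(\hat{\theta}+2)$ and solving for $\hat{\theta}$ gives $\hat{\theta}=\frac{2\overline{X}_n}{1-\overline{X}_n}$.)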
By the WLLN, $\bar{X}_n \xrightarrow{p} \mathbb{E}[X]$. Now note that $g(x)=\frac{2x}{1-x}$ is a continuous function on $(0,1)$, and $\mathbb{E}[X]\in(0,1)$, so by the continuous mapping theorem $$ \frac{2\bar{X}_n}{1-\bar{X}_n} \xrightarrow{p} \frac{2\mathbb{E}[X]}{1-\mathbb{E}[X]}. $$ So you just have to calculate $$ \mathbb{E}[X]=\int_{(0,1)} xf_X(x;\theta)\,dx $$ and plug in the result: if the limit equals $\theta$, consistency follows.
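For the asymptotic distribution that part (b) also asks for, a sketch via the delta method (writing $\mu=\mathbb{E}[X]=\theta/(\theta+2)$, and $\sigma^2=\operatorname{Var}(X)$, which a direct calculation of $\mathbb{E}[X^2]$ shows equals $\frac{2\theta}{(\theta+2)^2(\theta+3)}$): by the CLT,
$$ \sqrt{n}\,(\bar{X}_n-\mu)\xrightarrow{d}N(0,\sigma^2), $$
and with $g(x)=\frac{2x}{1-x}$ we have $g'(x)=\frac{2}{(1-x)^2}$, so $g'(\mu)=\frac{(\theta+2)^2}{2}$. The delta method then gives
$$ \sqrt{n}\,(\hat{\theta}_n-\theta)\xrightarrow{d}N\!\left(0,\;g'(\mu)^2\sigma^2\right)=N\!\left(0,\;\frac{\theta(\theta+2)^2}{2(\theta+3)}\right). $$
(Check the variance algebra yourself before relying on the final constant.)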