Find an estimator of $\theta$ using the method of moments; call it $\hat \theta$. Prove $\hat \theta$ is a consistent estimator of $\theta$


A random sample of size $n$, $Y_1, Y_2, \ldots, Y_n$ is drawn from a population. The population distribution has the following probability density function

$f(y;\theta)= \begin{cases} \sqrt{\theta}\,y^{\sqrt{\theta}-1}, & 0 \le y \le 1 \\ 0, & \text{elsewhere} \end{cases}$

Find an estimator of $\theta$ using the method of moments; call it $\hat{\theta}$. Prove that $\hat{\theta}$ is a consistent estimator of $\theta$.

What I have done so far:

First I opted to find $\mu$. So I set up my integral and solved: $$\mu = \int_0^1 \sqrt{\theta}y^{\sqrt{\theta}}dy = \frac{\sqrt{\theta}}{\sqrt{\theta}+1}$$
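As a quick sanity check on that formula (with $\theta = 4$ as an arbitrary test value, not part of the problem), the integral can be evaluated numerically:

```python
import numpy as np

# Numerical check of mu = sqrt(theta)/(sqrt(theta)+1).
# theta = 4 is an arbitrary test value, not part of the original problem.
theta = 4.0
s = np.sqrt(theta)

y = np.linspace(0.0, 1.0, 1_000_001)
integrand = y * s * y ** (s - 1)  # y * f(y; theta)

# Composite trapezoid rule for the integral of y * f(y) over [0, 1].
h = y[1] - y[0]
mu_numeric = h * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

mu_formula = s / (s + 1)
print(mu_numeric, mu_formula)  # both should be close to 2/3
```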

Then I solved the equation for $\theta$: from $\mu(\sqrt{\theta}+1) = \sqrt{\theta}$ we get $\sqrt{\theta}(1-\mu) = \mu$, so $\sqrt{\theta} = \frac{\mu}{1-\mu}$ and $\theta = \frac{\mu^2}{(1-\mu)^2}$. So I let the estimator be $\hat{\theta} = \frac{\bar{Y}^2}{(\bar{Y}-1)^2}$

My trouble is coming from how I will prove $\hat{\theta}$ is consistent. I know two ways to show something is consistent, but I am not entirely sure how to apply that in this case. I know if:

1) $\lim \limits_{n \to \infty}E(\hat{\theta}_n)= \theta$ and $\lim \limits_{n \to \infty}V(\hat{\theta}_n) = 0$

or

2) for every $\epsilon \gt 0$, $\lim \limits_{n \to \infty}P(|\hat{\theta}_n - \theta| \le \epsilon)=1$, or equivalently $\lim \limits_{n \to \infty}P(|\hat{\theta}_n - \theta| \gt \epsilon)= 0$

Then $\hat{\theta}$ is a consistent estimator for $\theta$. I'm just struggling to make use of those definitions. I'm not sure if my estimator is incorrect, or if I'm forgetting something, but any help would be greatly appreciated.

On BEST ANSWER

You can combine the weak law of large numbers and the continuous mapping theorem. By the weak law of large numbers, $\bar{Y}_n$ converges in probability to $\mu = \dfrac{\sqrt{\theta}}{\sqrt{\theta}+1}$, which lies in $(0,1)$. The function $g(y) = \dfrac{y^2}{(y-1)^2}$ is continuous on $[0,1)$, and in particular at $\mu$. So what can you conclude about $\hat{\theta}_n = g(\bar{Y}_n)$?

Note: The above convergence results are meant to take place in probability, which is what you need to show consistency.
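A short simulation can illustrate the convergence (again taking $\theta = 4$ as an arbitrary test value; samples are drawn by inverse transform, since the CDF here is $F(y) = y^{\sqrt{\theta}}$, so $Y = U^{1/\sqrt{\theta}}$ for $U \sim \mathrm{Uniform}(0,1)$):

```python
import numpy as np

# Empirical illustration of consistency. theta = 4 is an arbitrary test value.
# CDF: F(y) = y^sqrt(theta) on [0, 1], so inverse-transform sampling gives
# Y = U^(1/sqrt(theta)) for U ~ Uniform(0, 1).
rng = np.random.default_rng(0)
theta = 4.0

def theta_hat(y):
    """Method-of-moments estimator: theta_hat = Ybar^2 / (Ybar - 1)^2."""
    ybar = y.mean()
    return ybar**2 / (ybar - 1.0)**2

for n in (100, 10_000, 1_000_000):
    y = rng.uniform(size=n) ** (1.0 / np.sqrt(theta))
    print(n, theta_hat(y))  # estimates should settle near theta = 4 as n grows
```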