The PDF of $X$ is $\frac{1+\theta x}{2}$ for $x\in [-1,1], \theta \in[-1,1]$. Is the Method of Moments estimator of $\theta$ consistent?


We know that the method of moments estimator, $\hat{\theta}_n(\textbf{X})$, is consistent if

$$\lim_{n\to \infty}\mathbb{P}(|\hat{\theta}_n(\textbf{X})-\theta|>\epsilon)=0 \quad \text{for every } \epsilon>0.$$

We know that $\mathbb{E}(X;\theta)=\frac{1}{2}\int_{-1}^1(x+\theta x^2) dx=\frac{\theta}{3}$, so $\hat{\theta}_n(\textbf{X})=3\bar{X}$.
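As a quick sanity check (my own sketch, not part of the question), one can sample from this density by inverting its CDF, $F(x)=\frac{x+1}{2}+\frac{\theta(x^2-1)}{4}$, and compute $3\bar{X}$; the function and variable names here are illustrative:

```python
import math
import random

def sample_x(theta, rng):
    """Draw one X from the density (1 + theta*x)/2 on [-1, 1] by inverting
    the CDF F(x) = (x + 1)/2 + theta*(x**2 - 1)/4 (solved as a quadratic in x)."""
    u = rng.random()
    if abs(theta) < 1e-12:  # theta = 0: density is uniform on [-1, 1]
        return 2 * u - 1
    return (-1 + math.sqrt((1 - theta) ** 2 + 4 * theta * u)) / theta

rng = random.Random(0)
theta = 0.5
n = 100_000
xbar = sum(sample_x(theta, rng) for _ in range(n)) / n
theta_hat = 3 * xbar  # method-of-moments estimate
print(theta_hat)      # close to 0.5 for large n
```

For large $n$ the estimate lands near the true $\theta$, which is what consistency would predict.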

Therefore $\lim_{n\to \infty}\mathbb{P}(|\hat{\theta}_n(\textbf{X})-\theta|>\epsilon)= \lim _{n\to \infty}\mathbb{P}(|3\bar{X}-\theta|>\epsilon) $.

At this point I'm not sure how to proceed. I've tried using Chebyshev's inequality, but it doesn't seem to lead anywhere. I know that a sequence of random variables converging in distribution to a constant is equivalent to that sequence converging in probability to the same constant. Now $\lim_{n\to \infty}\mathbb{P}(\hat{\theta}_n(\textbf{X})\leq x)= \lim_{n\to \infty}\mathbb{P}(3\bar{X}\leq x)$, and I don't think this equals $\mathbb{P}(\theta\leq x)$. So would I be right in saying the method of moments estimator is not consistent?

Best answer:

You can easily use a necessary and sufficient condition for convergence in $L^2$, which is in turn a sufficient condition for consistency: the estimator is unbiased and its variance tends to zero.

$$\mathbb{E}[\hat{\theta}]=\mathbb{E}[3\overline{X}_n]=3\cdot \frac{\theta}{3}=\theta$$

$$\mathbb{V}[\hat{\theta}]=\mathbb{V}[3\overline{X}_n]=\frac{9\sigma^2}{n}=\frac{3-\theta^2}{n},$$

where $\sigma^2=\mathbb{V}[X]=\mathbb{E}[X^2]-\mathbb{E}[X]^2=\frac{1}{3}-\frac{\theta^2}{9}=\frac{3-\theta^2}{9}$.
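The two moments used above can be verified numerically; here is a minimal sketch (my own, with illustrative names) using composite Simpson's rule, which is exact here since the integrands are polynomials of degree at most three:

```python
def simpson(f, a, b, m=10_000):
    """Composite Simpson's rule on [a, b] with m (even) subintervals."""
    h = (b - a) / m
    s = f(a) + f(b)
    for k in range(1, m):
        s += (4 if k % 2 else 2) * f(a + k * h)
    return s * h / 3

theta = 0.7
pdf = lambda x: (1 + theta * x) / 2

mean = simpson(lambda x: x * pdf(x), -1, 1)            # should be theta/3
var = simpson(lambda x: x * x * pdf(x), -1, 1) - mean ** 2  # should be (3 - theta^2)/9

print(mean, theta / 3)
print(var, (3 - theta ** 2) / 9)
```

Both computed values agree with the closed forms $\mathbb{E}[X]=\theta/3$ and $\mathbb{V}[X]=(3-\theta^2)/9$ up to floating-point error.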

Now, since

$$\mathbb{E}[\hat{\theta}]=\theta$$

and

$$\lim\limits_{n \to \infty}\mathbb{V}[\hat{\theta}]=0,$$

the estimator converges to $\theta$ in $L^2$ and is therefore consistent.


Of course, applying Chebyshev's inequality gives the same result:

$$\mathbb{P}\Big\{\Big|3\overline{X}_n-\theta\Big|<\epsilon\Big\}\geq 1-\frac{3-\theta^2}{n \epsilon^2}\;\xrightarrow{\;n\to\infty\;}\;1.$$
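The Chebyshev bound can also be checked by simulation; this sketch (sampler and parameter choices are my own, not from the answer) estimates $\mathbb{P}(|3\overline{X}_n-\theta|>\epsilon)$ for growing $n$ and compares it with the bound $(3-\theta^2)/(n\epsilon^2)$:

```python
import math
import random

def sample_x(theta, rng):
    # Inverse-CDF draw from the density (1 + theta*x)/2 on [-1, 1]
    u = rng.random()
    if abs(theta) < 1e-12:  # theta = 0: uniform on [-1, 1]
        return 2 * u - 1
    return (-1 + math.sqrt((1 - theta) ** 2 + 4 * theta * u)) / theta

rng = random.Random(1)
theta, eps, trials = 0.5, 0.1, 2000
freq = {}
for n in (10, 100, 1000):
    misses = sum(
        abs(3 * sum(sample_x(theta, rng) for _ in range(n)) / n - theta) > eps
        for _ in range(trials)
    )
    freq[n] = misses / trials
    bound = min(1.0, (3 - theta ** 2) / (n * eps ** 2))
    # Empirical P(|3*Xbar - theta| > eps) vs. the Chebyshev upper bound
    print(n, freq[n], bound)
```

The empirical exceedance probability shrinks toward zero as $n$ grows and always stays below the Chebyshev bound, consistent with the limit above.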