Let $X_1, X_2, \ldots, X_n$ be independent and identically distributed random variables with $X_i \sim U(-\theta, \theta)$, where $\theta > 0$.


a) Determine the method of moments estimator for $\theta$.

b) Prove that the estimator found in a) is consistent.

Solution

a) $\hat{\theta} = \sqrt{\frac{3 \sum_{i=1}^{n} X_i^2}{n}}$
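As a quick sanity check on this formula, here is a minimal numerical sketch (the true value $\theta = 2$ and the sample size are hypothetical choices, not part of the problem): it draws a large uniform sample and evaluates $\hat\theta = \sqrt{3\sum X_i^2 / n}$, which should land close to $\theta$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # hypothetical true parameter for the sketch
n = 100_000  # hypothetical sample size

# Draw X_1, ..., X_n ~ U(-theta, theta)
x = rng.uniform(-theta, theta, size=n)

# Method of moments estimator: theta_hat = sqrt(3 * mean(X_i^2))
theta_hat = np.sqrt(3 * np.mean(x**2))
print(theta_hat)
```

With this many samples the estimate should agree with $\theta = 2$ to about two decimal places.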

but I don't know how to prove b). I couldn't prove that the estimator is asymptotically unbiased. Any help? Thanks.



Best answer

Note that unbiasedness is neither necessary nor sufficient for consistency. For instance, if $X_i$ are iid $ N(\theta,1)$, then $X_1$ is an unbiased but inconsistent estimator for $\theta$. Meanwhile, $\frac{1}{n}\left(1+\sum_{i=1}^n X_i\right)$ is a biased but consistent estimator for $\theta$.

Also, in terms of your updated post, asymptotic unbiasedness doesn't guarantee consistency by the same counterexample above. See this post for more general discussion.


Coming to your question: the weak law of large numbers (WLLN) and the continuous mapping theorem (CMT) immediately tell you that your estimator converges in probability to $\sqrt{3E[X_1^2]} = \sqrt{3 \cdot \theta^2/3} = \theta.$
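Convergence in probability can also be seen empirically: for any fixed $\varepsilon > 0$, the fraction of replications with $|\hat\theta_n - \theta| > \varepsilon$ should shrink as $n$ grows. A minimal sketch, assuming hypothetical values $\theta = 2$ and $\varepsilon = 0.1$ (neither is from the post):

```python
import numpy as np

rng = np.random.default_rng(42)
theta, eps, reps = 2.0, 0.1, 2000  # hypothetical values for the sketch

def miss_rate(n):
    """Empirical P(|theta_hat - theta| > eps) over `reps` samples of size n."""
    hats = np.array([np.sqrt(3 * np.mean(rng.uniform(-theta, theta, n) ** 2))
                     for _ in range(reps)])
    return np.mean(np.abs(hats - theta) > eps)

# The miss rate should drop toward 0 as n increases.
for n in (10, 100, 1000):
    print(n, miss_rate(n))
```

At $n = 1000$ the miss rate is essentially zero, while at $n = 10$ a large fraction of estimates land outside the $\varepsilon$-band, illustrating the WLLN/CMT argument.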

Second answer

It is not necessary to prove that $\hat\theta_n$ is unbiased, since we have the following theorems.

Thm 1 $\hat\theta_n \to \theta$ in probability if $E \hat\theta_n \to \theta$ and $Var\hat\theta_n \to 0$.

Thm 2 $g(\hat\theta_n)\to g(\theta)$ in probability if $\hat\theta_n\to\theta$ in probability and $g$ is continuous at $\theta$.

By Thm 1 and Thm 2 with $g(x) = \sqrt{x}$, it suffices to prove $E\hat{\theta}^2_n \to \theta^2$ and $Var \hat{\theta}_n^2 \to 0$.

For $E\hat\theta^2_n$, note that $EX_1^2 = \theta^2/3$, so $$ E\hat\theta^2_n = E \frac{3\sum X_i^2}{n} = 3EX_1^2 = \theta^2. $$

For $Var \hat\theta^2_n$, $$ Var \hat\theta^2_n = Var \frac{3\sum X_i^2}{n} = \frac9n Var X_1^2 \to 0, $$ since $Var X_1^2 = \frac{4\theta^4}{45} < \infty$, as desired.
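The two premises above can be checked numerically: across many replications, the sample mean of $\hat\theta_n^2 = 3\sum X_i^2 / n$ should sit near $\theta^2$, and its sample variance should shrink roughly like $9\,Var X_1^2 / n$. A minimal sketch, again assuming a hypothetical $\theta = 2$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0  # hypothetical true parameter for the sketch

def mom_sq_stats(n, reps=2000):
    """Empirical mean and variance of theta_hat^2 = 3*mean(X_i^2) over `reps` samples of size n."""
    est_sq = np.array([3 * np.mean(rng.uniform(-theta, theta, n) ** 2)
                       for _ in range(reps)])
    return est_sq.mean(), est_sq.var()

# Mean stays near theta^2 = 4 while the variance decays roughly like 9*Var(X_1^2)/n.
for n in (10, 100, 1000):
    print(n, mom_sq_stats(n))
```

This matches Thm 1 applied to $\hat\theta_n^2$: the expectation is exactly $\theta^2$ at every $n$, and the variance vanishes as $n \to \infty$.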