How to show that $\bar X$ is a consistent estimator


Can someone give me a hint with this exercise, please? Let $X_1,\dots,X_n$ be i.i.d. from a population with p.d.f. $f(x\mid\theta)=\theta x^{\theta-1}$,

$0<x<1$, $\theta>0$.

Show that $\bar X$ is a consistent estimator of $\frac{\theta}{\theta+1}$.

I know that to be consistent, the limit as $n \to \infty$ of the MSE must equal $0$.

How can I find the bias and the variance in order to calculate the MSE? Any help would be much appreciated.
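(For reference, the first two moments of this density follow by direct integration; they are the ingredients for both the bias and the variance:)

$$E[X_1]=\int_0^1 x\,\theta x^{\theta-1}\,dx=\theta\int_0^1 x^{\theta}\,dx=\frac{\theta}{\theta+1},\qquad E[X_1^2]=\int_0^1 x^2\,\theta x^{\theta-1}\,dx=\frac{\theta}{\theta+2}.$$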


BEST ANSWER

Call your parameter $\gamma:=\frac{\theta}{\theta+1}$ and call your estimator $\hat\gamma:=\bar X$. You are trying to show MSE converges to zero, i.e.

$$E[|\hat\gamma -\gamma |^2]\rightarrow 0,$$

but this is equivalent to showing convergence in $L^2$. Note this convergence is stronger than consistency, so it is not generally true that the "limit ...of the MSE must equal $0$" for consistency to hold (it is sufficient, although not necessary, for consistency).

Consistency, by definition, means convergence in probability, i.e.

$$\hat\gamma\overset{p}{\rightarrow }\gamma,$$

which in your case follows immediately by the weak law of large numbers once you show $E[\hat\gamma]=\gamma.$
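To see the law-of-large-numbers argument in action, here is a quick simulation (a sketch, assuming NumPy; the inverse-CDF sampler uses the fact that the CDF of this density is $F(x)=x^\theta$ on $(0,1)$):

```python
import numpy as np

# Numerical sanity check (not part of the proof): watch the sample mean
# approach theta/(theta+1). Since the CDF of f(x|theta) = theta * x**(theta-1)
# on (0, 1) is F(x) = x**theta, inverse-transform sampling gives
# X = U**(1/theta) with U ~ Uniform(0, 1).
rng = np.random.default_rng(seed=0)
theta = 2.5  # any theta > 0 works; 2.5 is an arbitrary choice
gamma = theta / (theta + 1)  # the target, E[X_1]

for n in (100, 10_000, 1_000_000):
    samples = rng.uniform(size=n) ** (1.0 / theta)
    # the error shrinks roughly like 1/sqrt(n), as the MSE computation predicts
    print(n, abs(samples.mean() - gamma))
```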

ANSWER

Presumably $\overline{X}_n$ is the sample mean, that is $$\overline{X}_n=\frac1n\sum^n_{j=1}X_j,$$ and the $(X_j:j\in\mathbb{N})$ are i.i.d. with common distribution $$P[X_1\in dx]=\theta x^{\theta-1}\mathbb{1}_{(0,1]}(x)\,dx.$$ In that case, $$E[\overline{X}_n]=E[X_1]=\frac{\theta}{\theta+1}.$$

There are many ways to show that $\overline{X}_n$ is consistent, that is, that $\overline{X}_n$ converges in probability to $\frac{\theta}{\theta+1}$. The quickest is by the strong law of large numbers, which states in fact that $\overline{X}_n$ converges almost surely and in $L_1$ to $E[X_1]=\frac{\theta}{\theta+1}$; almost sure convergence implies convergence in probability.

As for convergence in MSE, notice that the quadratic error $E[(\overline{X}_n-\tfrac{\theta}{\theta+1})^2]$ coincides in this case with the variance of $\overline{X}_n$, since the estimator is unbiased; hence $$E[(\overline{X}_n-\tfrac{\theta}{\theta+1})^2]=\operatorname{var}(\overline{X}_n)=\frac{1}{n^2}n\operatorname{var}(X_1)=\frac{1}{n}\operatorname{var}(X_1)\xrightarrow{n\rightarrow\infty}0$$
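For completeness, the variance of $X_1$ can be made explicit from the first two moments $E[X_1]=\frac{\theta}{\theta+1}$ and $E[X_1^2]=\frac{\theta}{\theta+2}$:

$$\operatorname{var}(X_1)=\frac{\theta}{\theta+2}-\left(\frac{\theta}{\theta+1}\right)^2=\frac{\theta}{(\theta+1)^2(\theta+2)},\qquad\text{so}\qquad E\!\left[\left(\overline{X}_n-\tfrac{\theta}{\theta+1}\right)^2\right]=\frac{\theta}{n(\theta+1)^2(\theta+2)}\xrightarrow{n\to\infty}0.$$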