Showing an estimator is inconsistent


I am given a random sample $X_1,X_2,\dots,X_n$ from a population with an exponential distribution, and I've shown that $\overline X$ is a consistent estimator of the population mean $\theta$ by showing $$\operatorname{E}[\overline X] = \theta$$ and $$\lim\limits_{n \to \infty}\operatorname{Var}(\overline X) = \lim\limits_{n \to \infty}\frac{\theta^2}{n} = 0.$$

My textbook tells me this is sufficient to show that $\overline X$ is a consistent estimator of $\theta$. In the next problem, however, I am asked whether $X_n$ (the single last observation) is a consistent estimator of the parameter $\theta$. Here $$\operatorname{E}[X_n] = \theta$$ but $$\lim\limits_{n \to \infty}\operatorname{Var}(X_n) = \lim\limits_{n \to \infty}\theta^2 = \theta^2 \neq 0.$$ Thus I cannot conclude that this is a consistent estimator, nor does the theorem my textbook provides let me conclude that it is an inconsistent one.
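A quick simulation makes the contrast visible. This is a minimal Monte Carlo sketch (the values $\theta=2$ and $c=0.5$ are illustrative choices of mine, not from the problem): it estimates $P(|\overline X_n-\theta|<c)$ and $P(|X_n-\theta|<c)$ for growing $n$. The first proportion climbs toward $1$, while the second hovers around the same constant regardless of $n$.

```python
import random

# Monte Carlo sketch: compare the sample mean with the single last
# observation as estimators of the mean theta of an exponential
# population. theta and c below are illustrative, not from the problem.
random.seed(0)
theta, c, trials = 2.0, 0.5, 3000

for n in (5, 50, 500):
    hits_mean = hits_last = 0
    for _ in range(trials):
        # random.expovariate takes the rate, so mean = theta
        sample = [random.expovariate(1 / theta) for _ in range(n)]
        if abs(sum(sample) / n - theta) < c:   # sample mean within c of theta
            hits_mean += 1
        if abs(sample[-1] - theta) < c:        # last observation within c
            hits_last += 1
    print(n, hits_mean / trials, hits_last / trials)
```

As $n$ grows, the first printed proportion approaches $1$, but the second stays flat: one observation carries the same spread no matter how large the sample is.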


BEST ANSWER

If $X_n$ is a consistent estimator of $\theta$, then by definition $$\forall\, c>0,\quad \lim_{n\to \infty} P(|X_n-\theta|<c) = 1.$$

But $X_n$ is exponential with mean $\theta$, so its CDF is $F(x)=1-e^{-x/\theta}$ for $x>0$. For $0<c<\theta$, $$P(|X_n-\theta|<c) = P(\theta-c<X_n<\theta+c) = F(\theta+c)-F(\theta-c) = e^{\frac{c}{\theta}-1}-e^{-\frac{c}{\theta}-1},$$

which does not depend on $n$, so its limit as $n\to\infty$ is this constant, which is strictly less than $1$.

Thus the estimator is inconsistent.
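As a sanity check on the algebra, here is a short numerical sketch (with illustrative values $\theta=2$, $c=0.5$ satisfying $0<c<\theta$, chosen by me) confirming that the CDF difference $F(\theta+c)-F(\theta-c)$ matches the closed form $e^{c/\theta-1}-e^{-c/\theta-1}$:

```python
import math

# Exp(mean=theta) CDF: F(x) = 1 - e^{-x/theta} for x > 0, else 0.
def exp_cdf(x, theta):
    return 1 - math.exp(-x / theta) if x > 0 else 0.0

theta, c = 2.0, 0.5   # illustrative values with 0 < c < theta
direct = exp_cdf(theta + c, theta) - exp_cdf(theta - c, theta)
closed = math.exp(c / theta - 1) - math.exp(-c / theta - 1)
print(direct, closed)  # equal, and with no n anywhere in sight
```

Both expressions evaluate to the same constant (about $0.186$ for these values), independent of $n$, which is exactly why the limit cannot be $1$.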

ANOTHER ANSWER

Two desirable properties of an estimator are unbiasedness and consistency. Unbiased means that in expectation it equals the parameter; consistent means that as the sample size grows, the estimator converges in probability to the parameter. Here, $$\operatorname{E}[X_n]=\theta \implies \text{the estimator is unbiased},$$ but the estimator is inconsistent because for some $\epsilon>0$, $$\lim_{n\to \infty} P(|X_n-\theta|<\epsilon) \ne 1.$$ Note that $X_n$ is a random variable, which is why we argue about its convergence in probability rather than ordinary convergence.