Consistency of Biased Estimators


In Statistical Inference, we were taught this theorem,

Consider an estimator $T_n$ of a population parameter $\theta$, based on $n$ samples. $T_n$ is a consistent estimator of $\theta$ if $$E[T_n] \to \theta \quad \text{and} \quad Var(T_n) \to 0.$$
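As a quick sanity check of the theorem, here is a minimal simulation (assuming, for illustration, normal samples with $\mu = 3$, $\sigma = 2$) showing that the sample mean satisfies both conditions: its average over many trials stays near $\mu$ while its across-trial variance shrinks as $n$ grows.

```python
import random

random.seed(0)

def sample_mean_trials(n, trials=2000, mu=3.0, sigma=2.0):
    """Empirical mean and variance of T_n = X_bar_n over many repeated trials."""
    means = []
    for _ in range(trials):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        means.append(sum(xs) / n)
    m = sum(means) / trials
    v = sum((t - m) ** 2 for t in means) / trials
    return m, v

for n in (10, 100, 1000):
    m, v = sample_mean_trials(n)
    # E[T_n] stays near mu = 3 while Var(T_n) = sigma^2 / n shrinks toward 0
    print(n, round(m, 3), round(v, 4))
```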

However, the converse of the above is not true. To show this, I constructed the following estimator of $\mu$. Let $X_1, X_2, \ldots, X_n$ be random samples from a population with finite mean $\mu$ and variance $\sigma^2$. Denote the sample mean by $\bar{X}_n$.

$$T_n = \bar{X}_n + n^2Y_n$$ where $$Y_n \sim Ber(\tfrac{1}{n})$$ and the $Y_n$'s are independent of the sample $X_1, X_2, \ldots, X_n$. Clearly $E[T_n] = \mu + n$, so $E[T_n] \not\to \mu$. By Chebyshev's inequality, $$(1 - \frac{\sigma^2}{n \epsilon^2}) \leq P(|\bar{X}_n - \mu| \leq \epsilon) \leq 1.$$ Noting that $Y_n = 0$ with probability $1 - \frac{1}{n}$, and that in that case $T_n = \bar{X}_n$, we can say

$$(1 - \frac{\sigma^2}{n \epsilon^2})(1 - \frac{1}{n}) \leq P(|T_n - \mu| \leq \epsilon) \leq 1$$ Taking the limit as $n \to \infty$, the lower bound tends to $1$, so $T_n$ is a consistent estimator of $\mu$.

Can someone produce an example that uses only the samples themselves (unlike the $Y_n$ I used here, since I'm not sure how we would generate the extra randomness $Y_n$ requires)?


An easy example is to sample from a Pareto distribution with shape parameter $\alpha \in (1, 2]$. Then the sample mean $\bar{X}_n$ is an unbiased and consistent estimator of the population mean whose variance does not tend to $0$. Indeed:

  • By the (weak) Law of Large Numbers, since $\alpha > 1$ the mean is finite, so the sample mean converges in probability to the expected value. Hence the estimator is consistent.
  • $\mathbb{E}[\bar{X}_n] = \mathbb{E}[X_1] = \mu$, i.e. the estimator is unbiased.
  • $\text{Var}(\bar{X}_n) = \text{Var}(X_1)/n = \infty$ for every $n$, since $\text{Var}(X_1) = \infty$ when $\alpha \leq 2$. Hence the variance does not converge to zero as $n \to \infty$.
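The behavior above can be observed in a small simulation. This is a sketch under the assumptions $\alpha = 1.5$ and scale $x_m = 1$ (so $\mathbb{E}[X_1] = \alpha/(\alpha - 1) = 3$, while $\text{Var}(X_1) = \infty$): the empirical probability that $\bar{X}_n$ lands within $\epsilon$ of the mean still rises with $n$, which is the consistency claim.

```python
import random

random.seed(2)
ALPHA, MU = 1.5, 3.0  # E[X_1] = alpha / (alpha - 1) for Pareto with x_m = 1

def coverage(n, eps=0.5, trials=1000):
    """Empirical P(|X_bar_n - mu| <= eps) for Pareto(alpha) samples."""
    hits = 0
    for _ in range(trials):
        xbar = sum(random.paretovariate(ALPHA) for _ in range(n)) / n
        hits += abs(xbar - MU) <= eps
    return hits / trials

for n in (100, 1000, 5000):
    # consistency: coverage increases with n despite the infinite variance
    print(n, coverage(n))
```

Note that convergence here is slow (the heavy tail means deviations shrink like $n^{1/\alpha - 1}$ rather than $n^{-1/2}$), and the empirical variance of $\bar{X}_n$ across trials remains erratic, as expected when the true variance is infinite.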