Consistency implies that bias & variance vanish?


Using Markov's inequality, one can easily show that if both the bias and the variance of an estimator vanish, then the estimator is consistent for the parameter of interest. Yet I wonder why standard textbooks do not claim the converse: if an estimator is consistent, must both bias and variance vanish? Is this trivial, or is my intuition wrong?
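For reference, the forward direction alluded to here is a one-line Markov (Chebyshev-type) argument; writing $T_n$ for the estimator and $\theta$ for the target, and assuming the second moment exists,

$$\mathbb P\big(|T_n - \theta| > \varepsilon\big) = \mathbb P\big((T_n - \theta)^2 > \varepsilon^2\big) \le \frac{\mathbb E\big[(T_n - \theta)^2\big]}{\varepsilon^2} = \frac{\text{Var}[T_n] + \text{Bias}[T_n]^2}{\varepsilon^2} \xrightarrow[n\to\infty]{} 0.$$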

Best answer:

There exist consistent estimators whose bias and variance do not vanish as sample size goes to infinity.

Indeed, let $X\sim \mathcal N (\mu,\sigma^2)$ and consider the problem of estimating $\mu$ with $n$ i.i.d. copies of $X$ that we denote $X_1,\ldots,X_n$, and let $\bar X_n := \sum_{i=1}^n X_i/n$ be the sample mean.
Next, let $(Y_n)$ be a sequence of random variables such that for all $n$, $Y_n$ is independent of $\bar X_n$, $\mathbb E[Y_n] = 0$, $\text{Var} [Y_n] \ge c^2$ for some constant $c>0$, and $Y_n \to 0$ in probability. For a concrete choice, take $Y_n = \pm\sqrt n$ with probability $1/(2n)$ each and $Y_n = 0$ otherwise: then $\mathbb E[Y_n] = 0$, $\text{Var}[Y_n] = 1$, and $\mathbb P(Y_n \neq 0) = 1/n \to 0$.

It is easy to see that $Z_n := \bar X_n + Y_n$ is a consistent estimator of $\mu$; however, its variance is bounded below by $c^2$, and thus $\text{Var} [Z_n] \not\to 0$ as $n\to\infty$.
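As a sanity check, here is a small simulation of this construction (a sketch, not part of the original answer), using the concrete choice $Y_n = \pm\sqrt n$ with probability $1/(2n)$ each: the tail probability $\mathbb P(|Z_n - \mu| > 0.5)$ shrinks with $n$, while the empirical variance of $Z_n$ stays near $1$.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 1.0  # parameters of X ~ N(mu, sigma^2)

def simulate_z(n, reps=200_000):
    """Draw `reps` independent copies of Z_n = X_bar_n + Y_n, where
    Y_n = +sqrt(n) or -sqrt(n) with probability 1/(2n) each, else 0,
    so E[Y_n] = 0, Var[Y_n] = 1, and P(Y_n != 0) = 1/n -> 0."""
    # The sample mean of n i.i.d. N(mu, sigma^2) draws is N(mu, sigma^2 / n),
    # so we can sample it directly instead of averaging n draws.
    xbar = rng.normal(mu, sigma / np.sqrt(n), size=reps)
    u = rng.random(reps)
    y = np.where(u < 1 / (2 * n), np.sqrt(n),
                 np.where(u < 1 / n, -np.sqrt(n), 0.0))
    return xbar + y

for n in (10, 100, 1000):
    z = simulate_z(n)
    tail = np.mean(np.abs(z - mu) > 0.5)  # estimate of P(|Z_n - mu| > 0.5)
    print(f"n={n:5d}  P(|Z_n - mu| > 0.5) ~ {tail:.4f}  Var(Z_n) ~ {np.var(z):.3f}")
```

The tail probability heads to zero (consistency), yet the empirical variance hovers around $\text{Var}[Y_n] + \sigma^2/n \approx 1$ and never approaches zero.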

Similarly, this Wikipedia article gives an example of a sequence of consistent estimators whose bias stays uniformly bounded away from zero.

I suggest you read this excellent thread on stats.SE, where the relationship between consistency and unbiasedness is discussed with many illuminating examples.