Role of variance in consistent estimators


By definition, a consistent estimator (more precisely, a weakly consistent estimator) is one that converges in probability to the true parameter value as the number of data points increases. So it seems natural that the bias, defined as $\|E[\hat{\theta}]-\theta\|$, converges to $0$ as the number of points $m \rightarrow \infty$, and likewise that the variance goes to $0$, since in $E[(\hat{\theta}-\theta)(\hat{\theta}-\theta)^{T}]$ we will have $\hat{\theta}-\theta \rightarrow 0$.

However, I am still not sure whether I am thinking in the right direction, given the Kolmogorov SLLN as well as the no-free-lunch theorem.

One more question: since an unbiased estimator has bias $=0$, and hence its expectation sits exactly at the true value, is that not the definition of "strong" consistency?


On BEST ANSWER

Consistency does not imply that the variance goes to $0$. You can find counterexamples here:

https://stats.stackexchange.com/questions/74047/why-dont-asymptotically-consistent-estimators-have-zero-variance-at-infinity
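A minimal simulation of the standard kind of counterexample discussed in that thread (the specific setup here, normal data with an assumed true mean $\theta = 2$, is illustrative): the estimator equals the sample mean with probability $1 - 1/n$ but jumps to the value $n$ with probability $1/n$. The jump becomes rare enough that the estimator is consistent, yet it is large enough that the variance grows without bound.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0  # assumed true mean for this illustration

def estimator(x):
    """Sample mean with prob 1 - 1/n; the value n with prob 1/n."""
    n = len(x)
    return float(n) if rng.random() < 1.0 / n else x.mean()

results = []
for n in [10, 100, 1000, 10000]:
    reps = 5000
    ests = np.array([estimator(rng.normal(theta, 1.0, n)) for _ in range(reps)])
    # Empirical P(|T_n - theta| > 0.5): shrinks to 0, i.e. consistency.
    miss = np.mean(np.abs(ests - theta) > 0.5)
    # Exact moments: E[T_n] and E[T_n^2], mixing the two branches.
    m1 = (1 - 1 / n) * theta + 1.0            # E[T_n] -> theta + 1 (bias!)
    m2 = (1 - 1 / n) * (theta**2 + 1 / n) + n  # E[T_n^2], dominated by n
    var_exact = m2 - m1**2                     # grows roughly like n
    results.append((n, miss, var_exact))
    print(f"n={n:6d}  P(|T-theta|>0.5) ~ {miss:.4f}  exact Var = {var_exact:.1f}")
```

So convergence in probability says nothing about the moments: a vanishing-probability event with a diverging payoff leaves consistency intact while the variance (and even the bias, in this example) refuses to go to $0$.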