Concluding consistency of estimators


Say we have a set of $n$ iid random variables with variance $\sigma^{2}$ and an estimator $T$ of some parameter $\theta$. If we know that $\operatorname{Var}(T) = {\sigma^{2}\over n}$, is that enough to conclude that our estimator is consistent? (For instance, if $T=\bar X$ is our estimator of the population mean $\mu$, we also have $\mathbb{E}(T) = \mathbb{E}(\bar X) = \mu$.)

Best answer:

Yes, by Chebyshev's inequality, provided that $T$ is unbiased (so that $\mathbb{E}(T)=\theta$). Indeed, for all $\epsilon >0$, $$ P(|T-\theta|>\epsilon)\leq\frac{\operatorname{Var}(T)}{\epsilon^2}=\frac{\sigma^2}{\epsilon^2 n}, $$ which goes to $0$ as $n\to\infty$.

Note that in the case of $\bar{X}$ as an estimator of the mean $\mu$, you can reach the same conclusion from the weak law of large numbers.
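The Chebyshev argument above can be checked numerically. Here is a minimal Monte Carlo sketch (the normal population and the particular values of $\mu$, $\sigma$, and $\epsilon$ are illustrative assumptions, not from the question): for growing $n$, the empirical probability $P(|\bar X-\mu|>\epsilon)$ shrinks toward $0$ and stays below the Chebyshev bound $\sigma^2/(\epsilon^2 n)$.

```python
import numpy as np

# Assumed setup: normal population with mean mu and sd sigma; eps is arbitrary.
rng = np.random.default_rng(0)
mu, sigma, eps = 0.0, 1.0, 0.2
trials = 20000  # number of repeated samples used to estimate the probability

results = []
for n in [10, 100, 1000]:
    # Draw `trials` independent samples of size n and take each sample mean.
    means = rng.normal(mu, sigma, size=(trials, n)).mean(axis=1)
    emp = np.mean(np.abs(means - mu) > eps)  # empirical P(|T - theta| > eps)
    bound = sigma**2 / (eps**2 * n)          # Chebyshev upper bound
    results.append((n, emp, bound))
    print(f"n={n:4d}  empirical={emp:.4f}  Chebyshev bound={min(bound, 1):.4f}")
```

The printed probabilities decrease with $n$, consistent with convergence in probability; the bound is crude (it can exceed $1$ for small $n$) but suffices for the limit argument.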