Coming to the concept of minimum variance for an estimator


Our interest is to estimate a parametric function $g(\theta)$ so that the estimator is as good as possible. Suppose $T(X)$ is an estimator of $g(\theta)$. We would expect $T$ to have greater concentration about $g(\theta)$ than any rival estimator $T'$ of $g(\theta)$, that is,
$$P[|T-g(\theta)| < \epsilon] > P[|T'-g(\theta)| < \epsilon] \tag{#}$$
where $\epsilon > 0$ is an arbitrarily small quantity.

By Chebyshev's inequality,
$$P[|T-g(\theta)| < \epsilon] \ge 1- \frac{E(T-g(\theta))^2}{\epsilon^2} \quad \text{and} \quad P[|T'-g(\theta)| < \epsilon] \ge 1- \frac{E(T'-g(\theta))^2}{\epsilon^2}.$$
So, the claim is that a sufficient condition for (#) to hold is
$$E(T-g(\theta))^2 \le E(T'-g(\theta))^2.$$

I am confused: how is this a sufficient condition for (#)? Chebyshev's inequality only gives a *lower bound* on each probability, and comparing two lower bounds does not, by itself, compare the probabilities themselves. Is this a correct way to proceed in order to bring in the concept of minimum variance to justify the goodness of an estimator, or is there another way?
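To make the comparison concrete, here is a small Monte Carlo sketch of my own (the choice of estimators and parameters is purely illustrative, not part of any standard derivation): it compares two estimators of the mean $\theta$ of a $N(\theta, 1)$ population, the sample mean $T$ and the sample median $T'$, by their empirical mean squared error and their empirical concentration $P[|T-\theta| < \epsilon]$.

```python
import random
import statistics

# Illustrative setup (all values hypothetical): estimate theta from a
# Normal(theta, 1) sample using the sample mean T and the sample median T'.
random.seed(0)
theta, n, reps, eps = 0.0, 25, 20000, 0.2

def simulate(estimator):
    """Return (empirical MSE, empirical P[|estimate - theta| < eps])."""
    errs = []
    for _ in range(reps):
        sample = [random.gauss(theta, 1) for _ in range(n)]
        errs.append(estimator(sample) - theta)
    mse = sum(e * e for e in errs) / reps
    conc = sum(abs(e) < eps for e in errs) / reps
    return mse, conc

mse_mean, conc_mean = simulate(statistics.fmean)    # T  = sample mean
mse_med, conc_med = simulate(statistics.median)     # T' = sample median

print(f"sample mean  : MSE = {mse_mean:.4f}, P[|T - theta| < eps]  = {conc_mean:.3f}")
print(f"sample median: MSE = {mse_med:.4f}, P[|T' - theta| < eps] = {conc_med:.3f}")
```

In this particular example the estimator with the smaller MSE also turns out to be the more concentrated one, which is the intuition behind (#); but the simulation does not prove the implication in general, which is exactly what my question is about.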