efficiency of estimator, $\overline{\frac{1}{X^2}}$ vs $\frac{1}{\overline{X}^2}$ vs $\frac{1}{\overline{X^2}}$


I was studying point estimator, and I tried to compare the variances of the estimators to find out which one is more efficient. (Hogg, Tanis "Probability and Statistical Inference" Ch.6)

It was a regression model $Y_i=\theta x_i+e_i$, where each $x_i$ is a known constant and $e_i \sim N(0,\sigma^2)$.

The variances of the estimators $\hat \theta$ are

  1. $\left(\frac{\sigma^2}{\sum_{i=1}^n x_i^2}\right)=\left(\frac{\sigma^2}{n \overline{X^2}}\right)$
  2. $\left(\frac{n\sigma^2}{(\sum_{i=1}^n x_i)^2}\right)=\left(\frac{\sigma^2}{n \bar X^2}\right)$
  3. $\left(\frac{\sigma^2}{n^2}\sum_{i=1}^n\frac{1}{x_i^2}\right)=\left(\frac{\sigma^2}{n} \overline {\frac{1}{X^2}}\right)$
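A quick Monte Carlo sanity check can confirm these variance formulas before comparing them. This is only a sketch; the design points `x`, the slope `theta`, and `sigma` below are arbitrary choices, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: fixed positive design points x_i and a true slope theta.
n, theta, sigma = 10, 2.0, 1.0
x = rng.uniform(0.5, 3.0, size=n)

reps = 200_000
Y = theta * x + rng.normal(0.0, sigma, size=(reps, n))  # reps simulated datasets

est1 = (Y @ x) / (x @ x)          # least squares: sum(x_i Y_i) / sum(x_i^2)
est2 = Y.sum(axis=1) / x.sum()    # ratio of sums:  sum(Y_i) / sum(x_i)
est3 = (Y / x).mean(axis=1)       # mean of ratios: (1/n) sum(Y_i / x_i)

# Theoretical variances 1), 2), 3) from the formulas above
v1 = sigma**2 / np.sum(x**2)
v2 = n * sigma**2 / np.sum(x)**2
v3 = sigma**2 / n**2 * np.sum(1.0 / x**2)

for emp, theo in [(est1.var(), v1), (est2.var(), v2), (est3.var(), v3)]:
    print(f"empirical {emp:.5f}  vs  theoretical {theo:.5f}")
```

With positive $x_i$, the printed variances also come out in the order 1) $\le$ 2) $\le$ 3).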

I could easily show that the variance in 1) is at most the variance in 2): since $\sum(X_i- \bar X)^2=\sum X_i^2-n\bar X^2 \ge 0$, we have $\sum X_i^2 \ge n\bar X^2$, and taking reciprocals gives $\frac{\sigma^2}{\sum X_i^2} \le \frac{\sigma^2}{n\bar X^2}$.

The tricky part is 3). How can I compare $\overline{\frac{1}{X^2}}$ with the others?

I could not find a similar question. If there is, please let me know.

On BEST ANSWER

If you take the generalised or power mean of positive numbers, $\left(\frac{1}{n}\sum_{i=1}^n x_i^p\right)^{1/p}$, then it is nondecreasing in $p$ over the reals (with the geometric mean at $p=0$), and the same holds for the means of positive random variables.

So, taking $p=2,1,-2$, $\left(\overline {X^2}\right)^{1/2} \ge \overline {X} \ge \left(\overline {X^{-2}}\right)^{-1/2}$.

Squaring this, taking reciprocals (which reverses the inequalities), and multiplying by $\frac{\sigma^2}{n}$ tells you that $(1) \le (2) \le (3)$.

If $X$ can take negative values, then the relationship between $(2)$ and $(3)$ can be different.
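Both the ordering and the negative-value caveat can be checked numerically. The particular vectors below are made-up examples: a sign change makes $\sum x_i$ small, which inflates variance $(2)$ past $(3)$:

```python
import numpy as np

def variances(x, sigma=1.0):
    """Variances 1), 2), 3) of the three slope estimators for design x."""
    n = len(x)
    v1 = sigma**2 / np.sum(x**2)
    v2 = n * sigma**2 / np.sum(x)**2
    v3 = sigma**2 / n**2 * np.sum(1.0 / x**2)
    return v1, v2, v3

# All x_i positive: power-mean inequality gives (1) <= (2) <= (3)
xpos = np.array([0.5, 1.0, 2.0, 4.0])
v1, v2, v3 = variances(xpos)
print(v1 <= v2 <= v3)  # True

# Mixed signs: sum(x_i) is small, so (2) exceeds (3)
xmix = np.array([-1.0, 1.1, 1.0, 2.0])
w1, w2, w3 = variances(xmix)
print(w2 > w3)  # True -- ordering between (2) and (3) is reversed
```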