Comparing variances for estimators


Let $\tilde \beta$ and $\hat \beta$ be two different estimators of a parameter $\beta$ with $E(\tilde \beta)=\beta$ and $E(\hat \beta)=\beta$. I want to determine which estimator is superior. Since both are unbiased, I compare their variances:

$Var(\tilde \beta)=\frac{\sigma ^2 \sum_{i=1}^n\frac{1}{X_{i}^2}}{n^2}$

$Var(\hat \beta)=\frac{\sigma^2}{\sum_{i=1}^n X_{i}^2}$

I tried $Var(\hat \beta)-Var(\tilde \beta)$ as well as $\frac{Var(\hat \beta)}{Var(\tilde \beta)}$ but can't seem to make anything out of them. Any tips?
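Before attempting the algebra, a quick numeric sanity check can suggest which variance is smaller. The sketch below uses made-up data (the values of $X_i$ and $\sigma^2$ are arbitrary assumptions); only the direction of the inequality matters:

```python
# Numeric sanity check: evaluate both variance formulas on random
# positive samples and see which one tends to be smaller.
import random

random.seed(0)
sigma2 = 1.0  # assume sigma^2 = 1; it scales both formulas equally
n = 10

for trial in range(5):
    X = [random.uniform(0.5, 2.0) for _ in range(n)]
    var_tilde = sigma2 * sum(1 / x**2 for x in X) / n**2
    var_hat = sigma2 / sum(x**2 for x in X)
    print(var_tilde >= var_hat)  # True in every trial
```

Every trial prints `True`, which hints that $Var(\hat \beta) \le Var(\tilde \beta)$ always holds, not just for these samples.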

3 Answers

Accepted answer:

Write the sums as sample averages: with $\text{E}[\,\cdot\,]$ denoting the average over the observed values $X_1,\dots,X_n$,

$\text{Var}(\tilde{\beta}) = \frac{\sigma^2}{n}\text{E}\!\left[\frac{1}{X^2}\right]$

$\text{Var}(\hat{\beta}) = \frac{\sigma^2}{n}\frac{1}{\text{E}[X^2]}$

Since $x \mapsto 1/x$ is convex on $(0,\infty)$, Jensen's inequality gives $\text{E}[1/X^2] \ge 1/\text{E}[X^2]$, and therefore $\text{Var}(\hat{\beta}) \le \text{Var}(\tilde{\beta})$.

Another answer:

Your question boils down to: if $x_i$ are positive numbers, which is bigger, $\frac{1}{n^2} \sum_{i=1}^n \frac{1}{x_i}$, or $\frac{1}{\sum_{i=1}^n x_i}$? Well, we know that $f(x)=1/x$ is convex for $x>0$ (its second derivative is $2/x^3>0$), so it follows that $\frac{1}{\sum_i c_i x_i} \leq \sum_i c_i \frac{1}{x_i}$ whenever $c_i$ are nonnegative numbers summing to $1$ and $x_i$ are positive numbers. Take $c_i=\frac{1}{n}$ and see what you conclude.

This can be regarded as a case of Jensen's inequality, where we are comparing $E[f(X)]$ and $f(E[X])$ where $f(x)=1/x$ and $X$ is a random variable which is uniformly distributed on the set $\{ x_1,x_2,\dots,x_n \}$.
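This view of Jensen's inequality can be checked directly: with $X$ uniform on a (hypothetical) set of positive values, $E[1/X] \ge 1/E[X]$ because $f(x)=1/x$ is convex:

```python
# Jensen's inequality for f(x) = 1/x with X uniform on {x_1,...,x_n}:
# E[f(X)] >= f(E[X]) since f is convex on (0, inf).
xs = [0.5, 1.0, 2.0, 4.0]                          # hypothetical positive sample
mean = sum(xs) / len(xs)                            # E[X]
mean_of_recip = sum(1 / x for x in xs) / len(xs)    # E[1/X]
print(mean_of_recip >= 1 / mean)                    # True
```

Here $E[1/X] = 0.9375$ while $1/E[X] \approx 0.533$, so the gap can be substantial.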

Another answer:

The $p^{\rm th}$ power mean $$M_p(\boldsymbol x) = \left(\frac{1}{n} \sum_{i=1}^n x_i^p\right)^{1/p}$$ obeys the property $$p < q \implies M_p(\boldsymbol x) \le M_q(\boldsymbol x).$$ Thus, for $p = -1 < 1 = q$, we have $$M_{-1}(\boldsymbol x^2) \le M_1(\boldsymbol x^2),$$ or $$\frac{1}{n} \sum_{i=1}^n \frac{1}{x_i^2} \ge \left( \frac{1}{n} \sum_{i=1}^n x_i^2 \right)^{-1}.$$ Multiplying both sides by $\sigma^2/n$ gives $$\frac{\sigma^2}{n^2} \sum_{i=1}^n \frac{1}{x_i^2} \ge \sigma^2 \left(\sum_{i=1}^n x_i^2 \right)^{-1},$$ which can also be seen by considering Jensen's inequality.
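The power-mean monotonicity used above ($M_{-1} \le M_1$, i.e. harmonic mean $\le$ arithmetic mean) is easy to verify numerically on a hypothetical sample:

```python
# Power-mean inequality M_p <= M_q for p < q, applied to x_i^2
# with p = -1 (harmonic mean) and q = 1 (arithmetic mean).
xs = [0.5, 1.0, 2.0, 4.0]        # hypothetical positive sample
sq = [x**2 for x in xs]
n = len(sq)
M_neg1 = (sum(1 / v for v in sq) / n) ** -1   # harmonic mean of x_i^2
M_pos1 = sum(sq) / n                           # arithmetic mean of x_i^2
print(M_neg1 <= M_pos1)  # True by the power-mean inequality
```

Rearranging $M_{-1}(\boldsymbol x^2) \le M_1(\boldsymbol x^2)$ and scaling by $\sigma^2/n$ reproduces exactly the comparison of the two variances.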