I have three different unbiased estimators for the parameter $\theta$, and I want to determine which one has the minimum variance. I was told that the third one has the minimum variance, but can you please explain why?
$\#1: \hat{\theta} = \frac{1}{n}\sum\frac{Y_i}{x_i}$
$\#2: \hat{\theta}=\frac{\sum Y_i}{\sum x_i}$
$\#3: \hat{\theta} = \frac{\sum x_i Y_i}{\sum x_i^2}$
Suppose that each $Y_i$ is normally distributed with mean $\theta x_i$ and variance $\sigma^2$, and that all the $Y_i$'s are independent random variables.
By calculating the variance of each of the above estimators, I obtained the following results. (I found that the first estimator has the minimum variance, but it should be the third, not the first. Where did I go wrong?)
$\#1: V(\hat{\theta}) = \frac{\sigma^2}{n \sum x_i^2}$
$\#2: V(\hat{\theta}) = \frac{n\sigma^2}{\sum x_i^2}$
$\#3: V(\hat{\theta}) = \frac{\sigma^2}{\sum x_i^2}$
Any help would be much appreciated.
Here are the variances:
$Var(\hat{\theta}_1) = \frac{\sigma^2}{n^2}\sum\frac{1}{x_i^2}$

$Var(\hat{\theta}_2) = \frac{n\sigma^2}{(\sum x_i)^2}$

$Var(\hat{\theta}_3) = \frac{\sigma^2\sum x_i^2}{(\sum x_i^2)^2} = \frac{\sigma^2}{\sum x_i^2}$
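(Not part of the original answer.) As a quick sanity check, a short Monte Carlo simulation in NumPy confirms these closed-form variances; the design points `x`, the slope `theta`, and `sigma` below are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 2.0, 1.5                  # true slope and noise sd (arbitrary)
x = np.array([0.5, 1.0, 1.5, 2.0, 3.0])  # arbitrary nonzero design points
n = len(x)
reps = 200_000

# Simulate independent Y_i ~ N(theta * x_i, sigma^2), `reps` times.
Y = theta * x + sigma * rng.standard_normal((reps, n))

t1 = (Y / x).mean(axis=1)        # estimator #1
t2 = Y.sum(axis=1) / x.sum()     # estimator #2
t3 = (Y @ x) / (x**2).sum()      # estimator #3

# Closed-form variances derived above.
v1 = sigma**2 / n**2 * (1 / x**2).sum()
v2 = n * sigma**2 / x.sum()**2
v3 = sigma**2 / (x**2).sum()

for sim, exact in [(t1.var(), v1), (t2.var(), v2), (t3.var(), v3)]:
    print(f"simulated {sim:.4f}  vs  exact {exact:.4f}")
```

On these particular `x` values the exact variances come out to roughly $0.52$, $0.18$, and $0.14$, so the third estimator does have the smallest variance, matching the claim.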
$(3)\leq(2)$ follows from Cauchy–Schwarz applied to the vectors $(x_i)$ and $(1,\dots,1)$: $(\sum x_i)^2 \leq n\sum x_i^2$, hence $\frac{\sigma^2}{\sum x_i^2} \leq \frac{n\sigma^2}{(\sum x_i)^2}$. Let me know if you want to know more about this.
For $(3) \leq (1)$, note that the inequality is equivalent to $n \leq (\sum x_i^2)^\frac{1}{2}(\sum \frac{1}{x_i^2})^\frac{1}{2}$. You can get this by writing $n = \sum \frac{x_i}{x_i} = \sum x_i \cdot \frac{1}{x_i}$ and applying Cauchy–Schwarz again, this time to the vectors $(x_i)$ and $(1/x_i)$.
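(Again my own addition, not from the answer above.) Both Cauchy–Schwarz inequalities can be spot-checked numerically on random design points; the number of trials and the range of `x` here are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1_000):
    n = int(rng.integers(2, 20))
    x = rng.uniform(0.1, 10.0, size=n)   # arbitrary nonzero design points

    # (3) <= (2): Cauchy-Schwarz with (x_i) and (1,...,1)
    # gives (sum x_i)^2 <= n * sum x_i^2.
    assert x.sum() ** 2 <= n * (x**2).sum() + 1e-9

    # (3) <= (1): Cauchy-Schwarz with (x_i) and (1/x_i)
    # gives n^2 = (sum x_i * 1/x_i)^2 <= (sum x_i^2)(sum 1/x_i^2).
    assert n**2 <= (x**2).sum() * (1 / x**2).sum() + 1e-9

print("both inequalities hold in every trial")
```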