estimation of a parameter


The question is: $x_i = \alpha + \omega_i, $ for $i = 1, \ldots, n.$

where $\alpha$ is a non-zero but unknown constant parameter to be estimated, and the $\omega_i$ are uncorrelated, zero-mean, Gaussian random variables with known variances $\sigma_i^2$. Note that $\sigma_i^2$ and $\sigma_j^2$, for $i \neq j$, may be distinct. We wish to estimate $\alpha$ from a weighted sum of the $x_i$, i.e.

$$\hat{\alpha} = \sum^n_{i=1}b_ix_i$$

Determine $b_i$, $i= 1, \ldots, n$, such that $\hat{\alpha}$ is unbiased and the variance of $\hat{\alpha}$ is as small as possible.

I have tried to use the unbiased condition and get that: $\sum_{i=1}^nb_i = 1$

I don't know how to use the condition that the variance of $\hat{\alpha}$ should be as small as possible.


The weights should be proportional to the reciprocals of the variances: $$ b_k = \frac{1/\sigma_k^2}{\sum_{i=1}^n 1/\sigma_i^2}.\tag1 $$ This can be shown with Lagrange multipliers.

The variance of $\sum_{i=1}^n b_i x_i$ is $\sum_{i=1}^n b_i^2\sigma_i^2$. The problem is to minimize that subject to the constraint $\sum_{i=1}^n b_i=1$. The $i$th component of the gradient of the objective is $2b_i\sigma_i^2$; at the minimum this must equal $\lambda$ times the $i$th component of the gradient of the constraint function $\sum_{i=1}^n b_i$, which is $1$. Hence $b_i = \lambda/(2\sigma_i^2)$, and the constraint $\sum_{i=1}^n b_i = 1$ fixes $\lambda$, giving exactly the weights in $(1)$.
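As a quick numerical sanity check (using hypothetical variances, not ones from the problem), we can compare the variance achieved by the inverse-variance weights in $(1)$ against many random weight vectors satisfying $\sum_i b_i = 1$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example variances; any positive values would work.
sigma2 = np.array([1.0, 4.0, 0.25, 9.0])

# Inverse-variance weights from (1); they sum to 1 by construction.
b = (1 / sigma2) / np.sum(1 / sigma2)
var_opt = np.sum(b**2 * sigma2)  # variance of the weighted estimator

# The optimum simplifies to 1 / sum(1/sigma_i^2).
print(var_opt, 1 / np.sum(1 / sigma2))

# No random weight vector with sum(w) = 1 should do better.
for _ in range(10_000):
    w = rng.normal(size=sigma2.size)
    w /= w.sum()  # enforce the unbiasedness constraint
    assert np.sum(w**2 * sigma2) >= var_opt - 1e-12
```

The closed form $1/\sum_i (1/\sigma_i^2)$ for the minimal variance follows by substituting $(1)$ into $\sum_i b_i^2\sigma_i^2$.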


For the unbiasedness, we have

$$ E\left(\hat{\alpha}\right)=E\left(\sum_{i=1}^nb_ix_i\right)=E\left(\sum_{i=1}^nb_i(\alpha+\omega_i)\right)=\alpha\sum_{i=1}^nb_i + E\left(\sum_{i=1}^nb_i\omega_i\right)=\alpha\sum_{i=1}^nb_i $$ and we get that $\sum_{i=1}^nb_i=1$ as you say.

The next step is to transform the model so that it is homoscedastic, which lets us invoke the Gauss-Markov theorem. Divide through by $\sigma_i$:

$$ \frac{x_i}{\sigma_i}=\frac{\alpha}{\sigma_i}+\frac{\omega_i}{\sigma_i}\Rightarrow x_i^*=\alpha\frac{1}{\sigma_i}+\omega_i^* $$ where $\omega_i^*\sim N(0, 1)$ (stars indicate variance-adjusted quantities). This satisfies the usual OLS conditions, so by the Gauss-Markov theorem OLS is efficient and unbiased. The estimator then is:

$$ \hat{\alpha}=\arg\min_{a}\sum_{i=1}^n(x_i^*-a\frac{1}{\sigma_i})^2\Rightarrow-2\sum_{i=1}^n\frac{(x_i^*-a\frac{1}{\sigma_i})}{\sigma_i}=0\\ \sum_{i=1}^n\frac{x_i}{\sigma_i^2}=\sum_{i=1}^n\frac{a}{\sigma^2_i}\\ \sum_{i=1}^n\frac{x_i}{\sigma^2_i}=a\sum_{i=1}^n\frac{1}{\sigma^2_i}\\ \frac{\sum_{i=1}^n\frac{x_i}{\sigma^2_i}}{\sum_{i=1}^n\frac{1}{\sigma^2_i}}=a $$ so the weights are $$ b_i=\frac{\frac{1}{\sigma^2_i}}{\sum_{i=1}^n\frac{1}{\sigma^2_i}}. $$
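A short Monte Carlo simulation (with an assumed true $\alpha$ and assumed standard deviations, purely for illustration) confirms that this estimator is unbiased and that its variance matches $1/\sum_i(1/\sigma_i^2)$:

```python
import numpy as np

rng = np.random.default_rng(1)

alpha = 2.5                              # hypothetical true parameter
sigma = np.array([0.5, 1.0, 2.0, 3.0])   # assumed known standard deviations
b = (1 / sigma**2) / np.sum(1 / sigma**2)

# Simulate many realisations of x_i = alpha + omega_i and apply the estimator.
n_trials = 200_000
x = alpha + rng.normal(scale=sigma, size=(n_trials, sigma.size))
alpha_hat = x @ b

print(alpha_hat.mean())                  # close to alpha (unbiased)
print(alpha_hat.var())                   # close to 1 / sum(1/sigma_i^2)
```

The sample mean of $\hat{\alpha}$ converges to $\alpha$ and its sample variance to the theoretical minimum, in agreement with both answers above.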