The question is: $x_i = \alpha + \omega_i$, for $i = 1, \ldots, n$,
where $\alpha$ is a non-zero but unknown constant parameter to be estimated, and the $\omega_i$ are uncorrelated, zero-mean Gaussian random variables with known variances $\sigma_i^2$. Note that $\sigma_i^2$ and $\sigma_j^2$, for $i \neq j$, may be distinct. We wish to estimate $\alpha$ from a weighted sum of the $x_i$, i.e.
$$\hat{\alpha} = \sum^n_{i=1}b_ix_i$$
Determine $b_i$, $i= 1, \ldots, n$, such that $\hat{\alpha}$ is unbiased and the variance of $\hat{\alpha}$ is as small as possible.
I have tried to use the unbiasedness condition: from $E[\hat{\alpha}] = \sum_{i=1}^n b_i E[x_i] = \alpha\sum_{i=1}^n b_i = \alpha$, I get $\sum_{i=1}^n b_i = 1$.
I don't know how to use the condition that the variance of $\hat{\alpha}$ should be as small as possible.
The weights should be proportional to the reciprocals of the variances: $$ b_k = \frac{1/\sigma_k^2}{\sum_{i=1}^n 1/\sigma_i^2}.\tag1 $$ This can be shown with Lagrange multipliers.
The variance of $\sum_{i=1}^n b_i x_i$ is $\sum_{i=1}^n b_i^2\sigma_i^2$, since the $x_i$ are uncorrelated. The problem is to minimize that subject to the constraint $\sum_{i=1}^n b_i=1$. The $i$th component of the gradient of the objective is $2b_i\sigma_i^2$. At a constrained minimum, that gradient must be a scalar multiple of the gradient of the constraint function $\sum_{i=1}^n b_i$, whose components are all $1$. So $2b_i\sigma_i^2 = \lambda$ for some constant $\lambda$, i.e. $b_i \propto 1/\sigma_i^2$, and normalizing the weights so that they sum to $1$ gives exactly $(1)$. The resulting minimum variance is $1/\sum_{i=1}^n (1/\sigma_i^2)$.
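As a quick numerical sanity check (a sketch, with example variances chosen for illustration), the following Python snippet verifies that the inverse-variance weights of $(1)$ satisfy the constraint and yield a smaller variance than another unbiased choice such as equal weights:

```python
# Example variances sigma_i^2 (assumed values, for illustration only)
sigma2 = [1.0, 4.0, 9.0]

# Optimal weights from (1): b_k proportional to 1/sigma_k^2, normalized
inv = [1.0 / s for s in sigma2]
total = sum(inv)
b = [w / total for w in inv]

def est_var(weights, variances):
    """Variance of sum b_i x_i when the x_i are uncorrelated."""
    return sum(w * w * s for w, s in zip(weights, variances))

# Unbiasedness constraint: weights sum to 1
assert abs(sum(b) - 1.0) < 1e-12

opt = est_var(b, sigma2)                # should equal 1 / sum(1/sigma_i^2)
uniform = est_var([1.0 / 3] * 3, sigma2)  # equal weights, also unbiased

assert abs(opt - 1.0 / total) < 1e-12
assert opt < uniform                    # optimal weights beat equal weights
```

The two assertions at the end confirm that the minimized variance equals $1/\sum_i (1/\sigma_i^2)$ and that it beats the equal-weight estimator for these variances.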