A chemist wants to determine the amount of a certain substance $\mu$ in a specific type of food. In the lab, the chemist has two measuring instruments $A$ and $B$. The results from the instruments can be assumed to be independent of each other.
Measuring results from instrument $A$: $X_1, X_2, ..., X_n$
Measuring results from instrument $B$: $Y_1, Y_2, ..., Y_m$.
The standard deviations for the results from $A$ and $B$ are $\sigma_X = \sigma_Y = 1.6$. The expected value $\mu$ is also equal for both of them, but it is unknown. The chemist uses the following estimator for $\mu$:
$\hat \mu =a\bar{X}+b\bar{Y} = \frac{a}{n} \sum\limits_{i=1}^{n} X_i + \frac{b}{m} \sum\limits_{i=1}^{m} Y_i$
where $a$ and $b$ are positive constants the chemist has to choose depending on $n$ and $m$.
Assume that, for the same food test, the chemist makes $n=12$ measurements with instrument $A$ and $m=18$ measurements with instrument $B$. Find values for $a$ and $b$ such that the estimator $\hat \mu$ is both unbiased and has as low a variance as possible. What is the value of $a$?
My attempt: $E(\bar{X}) = E(\frac{1}{n}\sum\limits_{i=1}^{n} X_i) = \mu_X$

$Var(\bar{X}) = Var(\frac{1}{n}\sum\limits_{i=1}^{n} X_i) = \frac{1}{n^2}\sum\limits_{i=1}^{n} Var(X_i) = \frac{\sigma_X^2}{n}$
So my idea is that since we have $a$ and $b$ instead of $1$ in the sums in the task, the formulas will be identical except for an added $a^2$ and $b^2$ in the variance. I want to use the variance, because the expected value $\mu$ is not given.
So I set up the following equations: $\frac{1}{12}(a^2 \cdot 1.6^2)=1.6^2$ and $\frac{1}{18}(b^2 \cdot 1.6^2)=1.6^2$.
This is part of an online assignment, so I can check the answer immediately. I solved this in Wolfram, trying several values for $a$ and $b$, none of which are correct, however (one solution is $a=3.464, b = 4.243$). What have I done wrong?
We have to minimize \begin{align}\label{eq} Var(a\overline{X}+b\overline{Y})&=a^{2}Var(\overline{X})+b^{2}Var(\overline{Y})\\ &=a^{2}\frac{\sigma_{X}^{2}}{n}+b^{2}\frac{\sigma_{Y}^{2}}{m}, \end{align} where $\sigma_{X},\sigma_{Y},m,n$ are known. The constraint that $a\overline{X}+b\overline{Y}$ must be unbiased gives $$\mu=E(a\overline{X}+b\overline{Y})=a\mu+b\mu=(a+b)\mu,$$ so that $a+b=1$. So we can set $b=1-a$ in the above expression for the variance. You then have an expression in one variable $a$, and you can minimize it by calculating the zero of the derivative with respect to $a$. Here $\sigma_X=\sigma_Y=\sigma$, so setting $$\frac{d}{da}\left[a^{2}\frac{\sigma^{2}}{n}+(1-a)^{2}\frac{\sigma^{2}}{m}\right]=\frac{2a\sigma^{2}}{n}-\frac{2(1-a)\sigma^{2}}{m}=0$$ gives $am=(1-a)n$, i.e. $$a=\frac{n}{n+m}=\frac{12}{30}=0.4,\qquad b=0.6.$$
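As a quick numerical sanity check (not part of the original answer), the sketch below evaluates the variance expression $a^{2}\sigma^{2}/n+(1-a)^{2}\sigma^{2}/m$ with the given values and confirms that $a = n/(n+m)$ is a minimum:

```python
# Sanity check: minimize Var(a*Xbar + (1-a)*Ybar) over a,
# with sigma_X = sigma_Y = 1.6, n = 12, m = 18.
sigma = 1.6
n, m = 12, 18

def var(a):
    """Variance of the unbiased estimator a*Xbar + (1-a)*Ybar."""
    return a**2 * sigma**2 / n + (1 - a)**2 * sigma**2 / m

# Closed form from setting the derivative 2a*sigma^2/n - 2(1-a)*sigma^2/m = 0:
a_opt = n / (n + m)  # = 12/30 = 0.4

print(a_opt, var(a_opt))
```

The variance at $a=0.4$ is smaller than at any nearby value of $a$, consistent with the derivative calculation above.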