Suppose that independent observations $X_{1}$ and $X_{2}$ are taken from Poisson $P(aλ)$ and Poisson $P(bλ)$ distributions, respectively, where $a$ and $b$ are known and positive.
The maximum likelihood estimator of $λ$ is:
$\hat{λ} = \frac{X_{1} + X_{2}}{a + b}$
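For reference, this estimator follows from maximising the joint likelihood of the two independent observations:
$$L(λ) = e^{-aλ}\frac{(aλ)^{X_1}}{X_1!}\, e^{-bλ}\frac{(bλ)^{X_2}}{X_2!}, \qquad \log L(λ) = -(a+b)λ + (X_1+X_2)\log λ + \text{const}.$$
Setting $\frac{d \log L}{dλ} = -(a+b) + \frac{X_1+X_2}{λ} = 0$ gives $\hat{λ} = \frac{X_1+X_2}{a+b}$.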
(ii) Compare the sampling distribution of the maximum likelihood estimator with the sampling distributions of the estimators:
$T_{1} = \frac{X_{1} - X_{2}}{a - b}$
$T_{2} = \frac{1}{2} \left(\frac{X_{1}}{a} + \frac{X_{2}}{b} \right)$
and hence recommend an estimator for $λ$.
My Solution:
$T_{2}$ is a better estimator for the following reasons:
When $a = b$, the maximum likelihood estimator for $λ$ becomes $T_{2}$.
Since $\frac{d^{2}L(λ)}{dλ^{2}} = -\frac{X_{1}X_{2}}{λ} < 0$, we are at a maximum only when $λ$ is positive, and $T_{2}$ could be negative if $b > a$ or $X_{2} > X_{1}$.
Could someone please help me with this question? I suspect that my answer is incorrect.
Determining the best estimator depends on which properties you want a good estimator to have. A common way to compare estimators is by their bias and variance. Note that $$E\left[\frac{X_1+X_2}{a+b}\right] = \frac{1}{a+b}(a\lambda + b\lambda) = \lambda,$$ $$E\left[\frac{X_1-X_2}{a-b}\right] = \frac{1}{a-b}(a\lambda - b\lambda) = \lambda$$ and $$E\left[\frac12 \left(\frac{X_1}{a}+\frac{X_2}{b}\right)\right] = \frac12 \left(\frac{a\lambda}{a}+\frac{b\lambda}{b}\right) = \lambda.$$ So all three estimators are unbiased (with the caveat that $T_1$ is not defined when $a=b$). We therefore cannot single out a best estimator in terms of bias alone, and will proceed to compare variances. By independence, $$\operatorname{Var}\left(\frac{X_1+X_2}{a+b}\right) =\frac{\operatorname{Var}(X_1)+\operatorname{Var}(X_2)}{(a+b)^2}=\frac{a\lambda+b\lambda}{(a+b)^2} = \frac{\lambda}{a+b},$$ $$\operatorname{Var}\left(\frac{X_1-X_2}{a-b}\right) =\frac{\operatorname{Var}(X_1)+\operatorname{Var}(X_2)}{(a-b)^2}=\frac{a\lambda+b\lambda}{(a-b)^2} = \frac{(a+b)\lambda}{(a-b)^2}$$ and \begin{align*}\operatorname{Var}\left(\frac12 \left(\frac{X_1}{a}+\frac{X_2}{b}\right)\right) &=\frac{1}{4}\left(\frac{\operatorname{Var}(X_1)}{a^2} + \frac{\operatorname{Var}(X_2)}{b^2}\right) \\ &= \frac{1}{4}\left(\frac{\lambda}{a}+\frac{\lambda}{b}\right) \\ &= \frac{(a+b)\lambda}{4ab} .\end{align*} Since $(a+b)^2$, $(a-b)^2$ and $4ab$ each appear in a denominator, finding the estimator with the smallest variance amounts to determining which of these expressions is largest. Now $$(a-b)^2 + 4ab = a^2 + b^2 + 2ab = (a+b)^2,$$ so $(a+b)^2$ is the largest, which means the maximum likelihood estimator has the lowest variance of the three, and is thus the best estimator.
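As a sanity check, these bias and variance formulas can be verified by simulation. A quick sketch (the values $a=2$, $b=3$, $\lambda=1.5$ are arbitrary choices for illustration):

```python
import numpy as np

# Arbitrary illustrative values; a, b known and positive.
a, b, lam = 2.0, 3.0, 1.5
n = 200_000  # number of simulated (X1, X2) pairs
rng = np.random.default_rng(0)

X1 = rng.poisson(a * lam, n)
X2 = rng.poisson(b * lam, n)

mle = (X1 + X2) / (a + b)        # maximum likelihood estimator
T1 = (X1 - X2) / (a - b)
T2 = 0.5 * (X1 / a + X2 / b)

# Empirical means should all be close to lam = 1.5 (unbiasedness),
# and the empirical variances should match the theoretical values:
#   lam/(a+b), (a+b)*lam/(a-b)**2, (a+b)*lam/(4*a*b)
for name, est in [("MLE", mle), ("T1", T1), ("T2", T2)]:
    print(name, round(est.mean(), 3), round(est.var(), 3))
```

With these values the theoretical variances are $0.3$, $7.5$ and $0.3125$, so the simulated ordering should show the MLE winning.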