Let $X_i \sim \mathrm{Pois}(a\lambda)$, $i = 1, \dots, n$, and $Y_i \sim \mathrm{Pois}(\lambda)$, $i = 1, \dots, m$, all independent.
I want to find the MLE of $\lambda$.
$$L(\lambda; X, Y) = \prod_{i = 1}^n e^{-a \lambda} \cdot \frac{(a\lambda)^{x_i}}{x_i!} \ \prod_{i = 1}^m e^{-\lambda} \cdot \frac{\lambda^{y_i}}{y_i!} \\ = e^{-a\lambda n - \lambda m} \cdot (a\lambda)^{\sum^n x_i} \cdot \lambda^{\sum^m y_i} \prod^n (x_i!)^{-1} \prod^m (y_i!)^{-1} \\ l(\lambda; X, Y) = -a\lambda n - \lambda m + \sum^n x_i \cdot \ln(a\lambda) + \sum^m y_i \cdot \ln(\lambda) - \sum^n \ln(x_i!) - \sum^m \ln(y_i!) \\ l'(\lambda; X, Y) = -an - m + \frac{\sum^n x_i + \sum^m y_i}{\lambda}$$
which leads to
$$\hat \lambda = \frac{\sum^n x_i + \sum^m y_i}{an + m}$$
but according to a similar example I found in the book, it should be
$$\hat \lambda = \frac{n+m}{a\sum^n x_i + \sum^m y_i}$$
Where am I wrong here?
It's clearer to work with the kernel of the joint likelihood with respect to $\lambda$:
$$\mathcal L (\lambda \mid X, Y, a) \propto e^{-(an+m)\lambda} \lambda^{\sum^n x_i + \sum^m y_i},$$
where we omit every multiplicative factor that is not a function of $\lambda$. This gives the log-likelihood
$$\ell(\lambda \mid X, Y, a) = -(an+m)\lambda + (n \bar x + m \bar y) \log \lambda,$$
hence the critical points satisfy
$$0 = \frac{\partial \ell}{\partial \lambda} = -(an + m) + \frac{n\bar x + m \bar y}{\lambda},$$
or
$$\hat \lambda = \frac{n \bar x + m \bar y}{an + m},$$
which is equivalent to your result. The book's answer cannot possibly be correct: if the sample total equals $0$, then its $\hat \lambda$ is undefined.
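As a quick sanity check, we can simulate data and verify that the closed-form estimator $\hat\lambda = (n\bar x + m\bar y)/(an + m)$ actually maximizes the log-likelihood. This is a minimal sketch; the particular values of $a$, $\lambda$, $n$, $m$, and the random seed are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
a, lam_true = 2.0, 3.0
n, m = 50, 80

# Simulate X_i ~ Pois(a*lambda) and Y_i ~ Pois(lambda)
x = rng.poisson(a * lam_true, size=n)
y = rng.poisson(lam_true, size=m)

# Closed-form MLE derived above: (sum x + sum y) / (a n + m)
lam_hat = (x.sum() + y.sum()) / (a * n + m)

def loglik(lam):
    """Log-likelihood up to an additive constant in lambda."""
    return -(a * n + m) * lam + (x.sum() + y.sum()) * np.log(lam)

# The log-likelihood at lam_hat should dominate any alternative value
grid = np.linspace(0.5, 6.0, 1000)
assert loglik(lam_hat) >= loglik(grid).max()
print(f"lambda_hat = {lam_hat:.3f} (true value {lam_true})")
```

Because the log-likelihood is strictly concave in $\lambda$, the grid comparison confirms the unique interior maximum sits at the closed-form value; with moderate sample sizes the estimate lands close to the true $\lambda$.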