Suppose you have independent $X_1\sim \operatorname{Poisson}(\lambda p)$ and $X_2\sim \operatorname{Poisson}(\lambda(1-p))$, where $p$ is a known number in $(0,1).$ Derive a sufficient statistic for $\lambda$ based on the data $X_1, X_2$.
By independence, the joint PMF of $X_1$ and $X_2$ is:
$$P(X_1 = x_1, X_2 = x_2) = \frac{e^{-\lambda p}(\lambda p)^{x_1}}{x_1!} \cdot \frac{e^{-\lambda (1-p)}(\lambda (1-p))^{x_2}}{x_2!} = \frac{e^{-\lambda} (\lambda - \lambda p)^{x_2} (\lambda p)^{x_1} }{x_1!\, x_2!}$$ However, from there, I am lost. Can anyone offer some help/advice?
\begin{align} & \frac{e^{-\lambda p}(\lambda p)^{x_1}}{x_1!} \cdot \frac{e^{-\lambda (1-p)}(\lambda (1-p))^{x_2}}{x_2!} \\[10pt] = {} & \underbrace{\frac 1 {x_1! x_2!} p^{x_1} (1-p)^{x_2}}_{\large\text{A}\vphantom{\frac11}} \,\cdot\, \underbrace{e^{-\lambda} \lambda^{x_1+x_2}}_{\large\text{B}\vphantom{\frac11}} \end{align} The factor A does not depend on $\lambda.$ The factor B depends on $x_1,x_2$ only through their sum.
So by the Fisher–Neyman factorization theorem, $T(X_1, X_2) = X_1 + X_2$ is a sufficient statistic for $\lambda.$
The fact that $p$ is known (which means you're dealing with a family of distributions indexed by $\lambda$ alone, with $p$ fixed) matters here: it is what makes it possible to fold the part depending on $p$ into A rather than into B. If we needed to estimate $p$ as well, then $x_1+x_2$ would not suffice; we would need both individual values.
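A quick numerical sanity check of sufficiency (a sketch, not part of the derivation): since the two rates add, $X_1+X_2 \sim \operatorname{Poisson}(\lambda)$, and the conditional distribution of $X_1$ given $X_1+X_2=n$ works out to $\operatorname{Binomial}(n,p)$, which is free of $\lambda$. The snippet below verifies this for a few values of $\lambda$ (the function names are my own, chosen for illustration):

```python
from math import comb, exp, factorial

def poisson_pmf(k, mu):
    """P(N = k) for N ~ Poisson(mu)."""
    return exp(-mu) * mu**k / factorial(k)

def conditional_pmf(x1, n, lam, p):
    """P(X1 = x1 | X1 + X2 = n); note X1 + X2 ~ Poisson(lam)."""
    joint = poisson_pmf(x1, lam * p) * poisson_pmf(n - x1, lam * (1 - p))
    return joint / poisson_pmf(n, lam)

p, n, x1 = 0.3, 7, 2
binom = comb(n, x1) * p**x1 * (1 - p)**(n - x1)  # Binomial(n, p) pmf at x1
for lam in (0.5, 2.0, 10.0):
    # the conditional pmf is the same for every lambda
    assert abs(conditional_pmf(x1, n, lam, p) - binom) < 1e-12
```

That the conditional distribution given $T$ does not depend on $\lambda$ is exactly the definition of sufficiency, so this agrees with the factorization argument above.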