I stumbled upon the following problem in my research. We are trying to analyze $Z=\min(X,Y)$, where $X \sim \mathrm{Pois}(p\lambda)$ and $Y\sim \mathrm{Pois}((1-p)\lambda)$ are independent. Note that the two expectations are related (they sum to $\lambda$), yet in general not identical.
What we are most interested in is a closed-form expression for $\mathbb{E}Z$, or, alternatively, an expression simple enough to prove that the maximum of $\mathbb{E}Z$ over $p$ is attained at $p=\frac{1}{2}$.
I managed to find very little literature on the subject. In some places this scenario is called a "Poisson race", but I couldn't find anything relevant to my question.
I tried to go the manual way: \begin{equation} \begin{split} \mathbb{E} Z & = \sum_{n\geq 1} \Pr(\min(X,Y) \geq n) \\ & = \sum_{n\geq 1} \Pr(X\geq n\ \text{and}\ Y\geq n) \\ & = \sum_{n\geq 1} \Pr(X\geq n)\cdot \Pr(Y\geq n) \\ & = \sum_{n\geq 1}\Bigg[\Bigg(\sum_{i\geq n} \frac{(p \lambda)^i e^{-p\lambda}}{i!} \Bigg)\Bigg(\sum_{i\geq n} \frac{((1-p) \lambda)^i e^{-(1-p)\lambda}}{i!} \Bigg)\Bigg] \\ & = e^{-\lambda}\sum_{n\geq 1}\Bigg[\Bigg(\sum_{i\geq n} \frac{(p \lambda)^i}{i!} \Bigg)\Bigg(\sum_{i\geq n} \frac{((1-p) \lambda)^i }{i!} \Bigg)\Bigg] \\ & = e^{-\lambda}\sum_{n\geq 1}\Bigg[\Bigg(e^{p\lambda}-e_{n-1}(p\lambda) \Bigg)\Bigg(e^{(1-p)\lambda} - e_{n-1}((1-p)\lambda) \Bigg)\Bigg] \\ \end{split} \end{equation}
But this didn't lead to any relatively simple expression. I also tried relating the Taylor partial sums of $e^x$ to the Gamma function $\Gamma(x)$, but again with no result.
What seems clear, due to the symmetry of the expression under $p \mapsto 1-p$, is that the maximum is attained at $p=\frac{1}{2}$. Does anyone see a way to prove this without differentiating once and twice and doing all the dirty work?
Here $e_n(x) = \sum_{k=0}^{n} \frac{x^k}{k!}$ is the exponential sum function.

I couldn't find (by math or by Google) any closed form for the expectation, but here is some partial progress.
If $X, Y$ are independent Poisson with rates $\lambda p$ and $\lambda(1-p)$, then $X+Y$ is Poisson with rate $\lambda$, and conditioning on the value of $X+Y$ gives: $$\Pr[X = k \mid X+Y = n] = \binom nk p^k (1-p)^{n-k}$$ (In other words, conditioned on $X+Y = n$, $X$ is $\mathrm{Binomial}(n,p)$ and $Y = n - X$.)
It will probably be easier to show that $\mathbb E[Z \mid X+Y=n]$ is maximized when $p = \frac12$ for every $n$, and conclude (by the tower property) that $\mathbb E[Z]$ is also maximized when $p = \frac12$, than to deal with $\mathbb E[Z]$ directly. But even this isn't as easy as I thought when I wrote this answer...
I don't think the binomial observation will help you get exact values, but it can help with asymptotics, because for large $n$, you have the Chernoff bound to estimate how often the variable with the larger rate "loses the race" and becomes the minimum.
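For completeness, the standard Chernoff/KL bound that makes this quantitative: say $p < \frac12$, so conditioned on $X+Y = n$ the variable with the larger rate, $Y$, loses the race exactly when $X > n/2$, and $$\Pr[X \geq n/2 \mid X+Y = n] \leq \exp\!\big(-n\, D(\tfrac12 \,\|\, p)\big), \qquad D(a \,\|\, p) = a\ln\frac{a}{p} + (1-a)\ln\frac{1-a}{1-p},$$ so for $p \neq \frac12$ the minimum is the lighter-rate variable except with exponentially small probability in $n$.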