Suppose that $X \sim \text{Pois}(\lambda_1)$ and $Y \sim \text{Pois}(\lambda_2)$, and consider the Wasserstein metric
$$ d_W(X,Y) = \sup_{h \in \text{Lip}(1)} \vert E[h(X)]-E[h(Y)] \vert, $$ where $\text{Lip}(1)$ denotes the real-valued functions that are Lipschitz with constant at most 1.
I want to prove the following upper bound: $$ d_W(X,Y) \leq \vert \lambda_1-\lambda_2\vert $$
I am not sure whether this holds for a more general class of distributions (if we replace the $\lambda$'s with expectations), and, by extension, whether we need to use properties of the Poisson distribution at all.
As a generic attempt, taking an arbitrary $h \in \text{Lip}(1)$, we get
$$ \vert E[h(X)]-E[h(Y)] \vert \leq E[\vert h(X) - h(Y)\vert ] \leq E[\vert X - Y \vert ], $$ but this does not seem useful. It seems we would need to avoid moving the absolute value inside the expectation, but then I am coming up blank for ways to make this estimate.
I also tried googling for nice results about Lipschitz functions and the Poisson distribution, but could not find anything. I suppose we could use the fact that the Poisson distribution is discrete, but I do not see how working with sums would be preferable to working with expectations here.
The inequality you have already deduced is useful: it holds for *any* coupling of the two distributions, so it suffices to exhibit one good coupling. That is, you want to find a pair $(X,Y)$ on a common probability space such that the marginals are Poisson with rates $\lambda_1$ and $\lambda_2$ and such that $$ E[\lvert X-Y\rvert]\leq\lvert\lambda_1-\lambda_2\rvert. $$ To do this we can use the fact that the sum of independent Poisson variables is again Poisson distributed.
Assume w.l.o.g. that $\lambda_1\geq\lambda_2$ and let $A\sim\text{Pois}(\lambda_2),B\sim\text{Pois}(\lambda_1-\lambda_2)$ be independent. Then we define $Y=A$ and $X=A+B$ and notice that $X\sim\text{Pois}(\lambda_1),Y\sim\text{Pois}(\lambda_2)$. Since $X-Y=B\sim\text{Pois}(\lambda_1-\lambda_2)$ is nonnegative, we have $$ E[\lvert X-Y\rvert]=E[X-Y]=\lambda_1-\lambda_2=\lvert\lambda_1-\lambda_2\rvert. $$
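As a sanity check, this coupling is easy to simulate. A minimal sketch (the rates and sample size below are arbitrary choices, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2 = 3.0, 1.5   # assume lam1 >= lam2
n = 200_000

A = rng.poisson(lam2, size=n)         # Y = A ~ Pois(lam2)
B = rng.poisson(lam1 - lam2, size=n)  # independent, B ~ Pois(lam1 - lam2)
X, Y = A + B, A                       # X = A + B ~ Pois(lam1) by additivity

est = np.abs(X - Y).mean()            # Monte Carlo estimate of E|X - Y|
print(est)                            # should be close to lam1 - lam2 = 1.5
```

For any $h \in \text{Lip}(1)$ we then have $\lvert E[h(X)] - E[h(Y)]\rvert \leq E[\lvert X - Y\rvert]$, which the estimate above pins near $\lambda_1 - \lambda_2$.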
Note that a similar coupling works for other families: for normal distributions with a common variance, for instance, the shift $X=Y+(\mu_1-\mu_2)$ gives $E[\lvert X-Y\rvert]=\lvert\mu_1-\mu_2\rvert$.
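For normals with a common variance, the analogous (shift) coupling can be sketched the same way; everything below is illustrative, with arbitrary parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
mu1, mu2, sigma = 2.0, 0.5, 1.0
n = 100_000

Y = rng.normal(mu2, sigma, size=n)
X = Y + (mu1 - mu2)   # X ~ N(mu1, sigma^2); |X - Y| = mu1 - mu2 almost surely

print(np.abs(X - Y).mean())   # exactly mu1 - mu2 = 1.5
```

Here $\lvert X-Y\rvert$ is deterministic, so $d_W(X,Y)\leq\lvert\mu_1-\mu_2\rvert$ follows immediately from the same coupling bound.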