Context:
Let $x \in \mathbb{R}^n$ be the unknown probability vector of a finite discrete distribution $X$. We can draw samples from $X$ and we want to learn $x$.
Poissonization:
Each observation belongs to the $i^\text{th}$ category with probability $x_i$, so for a sample of fixed size $m \in \mathbb{N}$, the count of the $i^\text{th}$ category follows a binomial distribution $B(x_i\ ,\ m)$. These binomial random variables are not independent, since they sum to $m$ ($X$ is a distribution). However, I found a trick online: if you take a random sample size $M \sim \mathrm{Poisson}(m)$ instead of the fixed $m$, the count $M_i$ of the $i^\text{th}$ category is no longer binomial but follows a $\mathrm{Poisson}(m \times x_i)$ law. Furthermore, these $n$ Poisson random variables are independent! I am trying to prove this.
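To illustrate the trick before proving it, here is a minimal simulation sketch (the probability vector, the mean sample size and the number of repetitions below are made-up values): draw $M \sim \mathrm{Poisson}(m)$, then the category counts from a multinomial, and check that each count has mean and variance close to $m \times x_i$ with near-zero pairwise correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([0.2, 0.3, 0.5])   # assumed probability vector x (illustrative)
m = 100                          # mean sample size (illustrative)
reps = 50_000

# Random sample size M ~ Poisson(m), then category counts via a multinomial.
M = rng.poisson(m, size=reps)
counts = np.array([rng.multinomial(Mi, x) for Mi in M])

# Marginal check: each column should have mean ~ variance ~ m * x_i.
print("means    :", counts.mean(axis=0))   # ~ [20, 30, 50]
print("variances:", counts.var(axis=0))    # ~ [20, 30, 50]

# Independence check: pairwise correlation should be close to 0
# (with a fixed sample size m it would be negative).
print("corr(M_1, M_2):", np.corrcoef(counts[:, 0], counts[:, 1])[0, 1])
```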
My approach
I proved that $M_i \sim \mathrm{Poisson}(m \times x_i)$ for every $i$ (below I write $p = x_i$): $$ \sum_{j=0}^\infty \left(\ P[\mathrm{Poisson}(m)=j] \times P[B(p,j)=k]\ \right) = P[\mathrm{Poisson}(p\times m)=k]$$ Since the binomial term vanishes for $j < k$, this is equivalent to $$ \sum_{j=k}^{\infty} \left(\ P[\mathrm{Poisson}(m)=j] \times P[B(p,j)=k]\ \right) = P[\mathrm{Poisson}(p\times m)=k]$$ $\iff$ $$ \sum_{j=k}^{\infty} \left(\frac{e^{-m} m^j}{j!} \times \frac{j!\ p^k (1-p)^{j-k}}{k!\ (j-k)!} \right) = \frac{e^{-pm} (p m)^k}{k!}$$ Dividing both sides by $p^k / k!$: $$e^{-m} \sum_{j=k}^\infty \left(\frac{m^j (1-p)^{j-k}}{(j-k)!} \right) = e^{-pm} m^k$$ Substituting $j' = j - k$: $$e^{-m} \sum_{j'=0}^\infty \left(\frac{m^{j'+k} (1-p)^{j'}}{j'!} \right) = e^{-pm} m^k$$ $\iff$ $$e^{-m} e^{m(1-p)} m^k = e^{-pm}m^k, $$ which is true. $\square$
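As a quick numerical sanity check of this identity, one can truncate the infinite sum and compare both sides with `scipy.stats`; the values of $m$, $p$ and the range of $k$ below are arbitrary illustrative choices.

```python
from scipy.stats import poisson, binom

m, p = 7.0, 0.3          # arbitrary illustrative values
for k in range(10):
    # Truncate the infinite sum at j = 200, far into the Poisson(m) tail.
    lhs = sum(poisson.pmf(j, m) * binom.pmf(k, j, p) for j in range(k, 200))
    rhs = poisson.pmf(k, p * m)
    print(k, lhs, rhs)   # the two columns agree up to numerical precision
```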
Now, how can I prove that all those $n$ random variables are independent? I found this paper, which states that even if $X_1$, $X_2$ and $X_1 + X_2$ are all $\mathrm{Poisson}$, $X_1$ and $X_2$ need not be independent. But here we are not in the general case, since we have $\sum\limits_{i=1}^n x_i = 1$.
OK, let's start by stating the result you are looking for (according to my interpretation):
I will show the result for $n=2$; it is straightforward to generalize it to $n \geq 2$. In the case $n=2$ the setup can be stated in a simpler way, namely:
$$ m \sim \text{Pois}(\lambda)$$
and conditional on $m$: $$ \begin{align} X_1 &\sim \text{Binomial}(m, p_1) \\ X_2 = m - X_1 &\sim \text{Binomial}(m, 1-p_1) \end{align} $$
Now let $k,l \in \mathbb N$. Since you already derived the marginal distributions, it follows that:
$$\Pr\left[X_1=k\right] = \frac{e^{-p_1\lambda}(p_1\lambda)^k}{k!} \;,\; \Pr\left[X_2=l\right] = \frac{e^{-(1-p_1)\lambda}((1-p_1)\lambda)^l}{l!}$$
It also holds that:
$$ \begin{align} \Pr\left[X_1=k, X_2=l\right] &= \Pr\left[X_1=k, m=k+l\right] \\ &= \Pr\left[X_1=k \; \vert \; m=k+l\right]\Pr\left[m=k+l\right] \\ &= \binom{k+l}{k}p_1^k(1-p_1)^l \frac{e^{-\lambda}\lambda^{k+l}}{(k+l)!} \\ &= \Pr\left[X_1=k\right]\Pr\left[X_2=l\right] \end{align} $$
From the second to the third line I used the fact that $m \sim \text{Pois}(\lambda)$ and that $X_1 \big\vert m = k+l \sim \text{Binomial}(k+l, p_1)$. The last equality follows by expanding $\binom{k+l}{k} = \frac{(k+l)!}{k!\,l!}$ and splitting $e^{-\lambda} = e^{-p_1\lambda}\,e^{-(1-p_1)\lambda}$.
This proves the independence of $X_1$ and $X_2$.
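If useful, the factorization can also be verified numerically: the sketch below compares the joint probability from the third line of the derivation with the product of the two Poisson marginals ($\lambda$ and $p_1$ are arbitrary illustrative values).

```python
from scipy.stats import poisson, binom

lam, p1 = 5.0, 0.4       # arbitrary illustrative values
for k in range(6):
    for l in range(6):
        # Joint probability: Binomial(k+l, p_1) split of a Poisson(lam) total.
        joint = binom.pmf(k, k + l, p1) * poisson.pmf(k + l, lam)
        # Product of the two claimed Poisson marginals.
        product = poisson.pmf(k, p1 * lam) * poisson.pmf(l, (1 - p1) * lam)
        assert abs(joint - product) < 1e-12
print("joint = product of marginals for all checked (k, l)")
```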