In this problem, we have a sequence of iid uniform random variables on $[0,1]$, and we call these RVs particles. We also partition $[0,1]$ into $k$ non-overlapping subsets. We define $N$ to be a Poisson random variable specifying the total number of particles, and $N_j$ to be the number of particles in the $j$th set. The goal is to show that $N_1,\ldots,N_k$ are independent Poisson RVs.
To solve this, I consider the count in each subset as binomially distributed among the particles not yet assigned to an earlier subset. My initial idea is to compute the marginal from the conditional: $$ P(N_1=m_1,\ldots,N_k=m_k) = \sum_{n=1}^\infty\left(\prod_{i=1}^{k}\binom{n-\sum_{j=1}^{i-1}m_{j}}{m_i}d_{i}^{m_i}\Bigl(1-\sum_{j=1}^{i-1}d_{j}\Bigr)^{n-\sum_{j=1}^{i}m_{j}}\right)\frac{\lambda^n e^{-\lambda}}{n!}, $$ where the product is my proposed joint distribution of $N_{1},\ldots,N_{k}$ given $N=n$, and $\frac{\lambda^n e^{-\lambda}}{n!}$ is the pmf of $N$.
But I have no idea how to evaluate this sum, or whether I am missing a simpler approach. Any help would be greatly appreciated!
In your first equation it should be $n=m_1+\ldots+m_k$, so the sum is not needed there. Next, the joint distribution of $N_1,\ldots,N_k$ conditionally on $N=n$ is not this product. Only the first factor is a valid probability: $$ \mathbb P(N_1=m_1 \mid N=n) =\binom{n}{m_1}d_1^{m_1}(1-d_1)^{n-m_1}. $$
And $$ \mathbb P(N_2=m_2 \mid N_1=m_1, N=n) =\binom{n-m_1}{m_2}\left(\frac{d_2}{1-d_1}\right)^{m_2}\left(1-\frac{d_2}{1-d_1}\right)^{n-m_1-m_2}, $$ and so on. When you condition on $\{N_1=m_1\}$, you put $m_1$ points into $[0,d_1]$, and the other $n-m_1$ points are uniformly distributed inside $(d_1,1]$.
Or you can use the multinomial distribution and simply write $$ \mathbb P(N_1=m_1,\ldots,N_k=m_k \mid N=n) =\frac{n!}{m_1!\cdots m_k!}d_1^{m_1}\cdots d_k^{m_k} $$ if $n=m_1+\ldots+m_k$, and zero otherwise.
So the unconditional joint distribution of $N_1,\ldots,N_k$ is $$ \mathbb P(N_1=m_1,\ldots,N_k=m_k) =\frac{(m_1+\ldots+m_k)!}{m_1!\cdots m_k!}d_1^{m_1}\cdots d_k^{m_k} \cdot\frac{\lambda^{m_1+\ldots+m_k}}{(m_1+\ldots+m_k)!}e^{-\lambda} =\prod_{i=1}^k \frac{(\lambda d_i)^{m_i}}{m_i!}e^{-\lambda d_i}, $$ where the last equality uses $d_1+\ldots+d_k=1$. Since the joint pmf factorizes into a product of $\mathrm{Poisson}(\lambda d_i)$ pmfs, the $N_i$ are independent.
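If you want to sanity-check this numerically, here is a minimal Monte Carlo sketch (the values of `lam`, the partition lengths `d`, and the trial count are arbitrary choices, not from the problem): each $N_i$ should have mean and variance $\lambda d_i$, and the off-diagonal sample covariances should be near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 5.0
d = np.array([0.2, 0.3, 0.5])          # interval lengths, summing to 1
edges = np.concatenate(([0.0], np.cumsum(d)))

trials = 100_000
counts = np.empty((trials, len(d)), dtype=np.int64)
for t in range(trials):
    n = rng.poisson(lam)               # total number of particles, N ~ Poisson(lam)
    u = rng.uniform(0.0, 1.0, size=n)  # iid uniform particle positions
    counts[t] = np.histogram(u, bins=edges)[0]

# Each N_i should be Poisson(lam * d_i): sample mean and variance both near lam * d_i
print(counts.mean(axis=0))             # close to [1.0, 1.5, 2.5]
print(counts.var(axis=0))              # close to [1.0, 1.5, 2.5]
# Independence: off-diagonal entries of the sample covariance matrix near zero
print(np.cov(counts.T))
```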
To get the marginal distribution of $N_i$ using the conditional distribution $$ \mathbb P(N_i=m_i\mid N=n) = \binom{n}{m_i}d_i^{m_i}(1-d_i)^{n-m_i}, $$ you can compute $$ \mathbb P(N_i=m_i)=\sum_{n=0}^\infty \mathbb P(N_i=m_i\mid N=n) \cdot \mathbb P(N=n) = \frac{(\lambda d_i)^{m_i}}{m_i!}e^{-\lambda d_i}. $$
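For completeness, the sum collapses via the exponential series: substituting $j=n-m_i$ (the terms with $n<m_i$ vanish since the binomial coefficient is zero),
$$
\sum_{n=m_i}^{\infty} \binom{n}{m_i} d_i^{m_i}(1-d_i)^{n-m_i}\,\frac{\lambda^{n}e^{-\lambda}}{n!}
= \frac{(\lambda d_i)^{m_i}}{m_i!}\,e^{-\lambda}\sum_{j=0}^{\infty}\frac{\bigl(\lambda(1-d_i)\bigr)^{j}}{j!}
= \frac{(\lambda d_i)^{m_i}}{m_i!}\,e^{-\lambda}e^{\lambda(1-d_i)}
= \frac{(\lambda d_i)^{m_i}}{m_i!}\,e^{-\lambda d_i}.
$$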