I am working through the lecture notes Lectures on the Poisson Process by Günter Last and Mathew Penrose (available online).
Definition 2.4: We shall refer to a point process $\eta$ on $\mathbb{X}$ as a proper point process if there exist random variables $X_1, X_2, \dots$ in $\mathbb{X}$ and an $\overline{\mathbb{N}_0}$-valued random variable $\kappa$ such that almost surely $$\eta = \sum_{n=1}^\kappa \delta_{X_n}.$$
At the end of the section I struggle with
Exercise 2.4: Let $\eta_1, \eta_2, \dots$ be a sequence of proper point processes. Show that $\eta:= \eta_1 + \eta_2 + \dots$ is a proper point process.
My approach: My main problem is that I can't exactly reproduce Definition 2.4. Evidently, for every $i \in \mathbb{N}$ I have $$\eta_i = \sum_{n=1}^{\kappa_i} \delta_{X_{n,i}},$$ where the $\kappa_i$ are $\overline{\mathbb{N}_0}$-valued RVs and the $X_{n,i}$ are random variables in $\mathbb{X}$. Therefore $$ \eta = \sum_{i=1}^\infty \eta_i= \sum_{i=1}^\infty\sum_{n=1}^{\kappa_i} \delta_{X_{n,i}}\overset{?}{=}\sum_{n=1}^\gamma \delta_{X_n}, $$ where the question mark refers to Definition 2.4: can I define a suitable RV $\gamma$ such that my double sum collapses into a single sum? My naive approach seems to lead me nowhere so far.
Update: I believe one approach that could get me out of my dilemma would be to take the random variables $K_1, K_2, \dots$ with values in $\overline{\mathbb{N}_0}$ and set $$K_\infty:= \sum_{i=1}^\infty K_i.$$ This should allow me to naturally relabel the points as $$ \lbrace \underbrace{X_{1,1}}_{=Y_1}, \underbrace{X_{2,1}}_{=Y_2}, \dots, \underbrace{X_{K_1,1}}_{=Y_{K_1}}, \underbrace{X_{1,2}}_{=Y_{K_1+1}}, \dots , \underbrace{X_{K_2,2}}_{=Y_{K_1+K_2}}, \dots \rbrace $$
All of the $Y_i$ are indeed RVs in $\mathbb{X}$ and $K_\infty$ is an RV with values in $\overline{\mathbb{N}_0}$, thus I can write $$\eta = \sum_{i=1}^{K_\infty} \delta_{Y_i}.$$
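The relabelling in the update amounts to concatenating the per-process point lists. A minimal sketch for a single realization, with hypothetical finite values $K_1 = 2$, $K_2 = 1$, $K_3 = 3$ and placeholder points:

```python
# hypothetical realized points X_{n,i} of three proper point processes,
# with K_1 = 2, K_2 = 1, K_3 = 3 points respectively
X = [["a1", "a2"], ["b1"], ["c1", "c2", "c3"]]

# concatenate row by row: Y_1, ..., Y_{K_infinity}
Y = [x for row in X for x in row]
K_inf = len(Y)
assert K_inf == sum(len(row) for row in X)  # K_infinity = K_1 + K_2 + ...
print(Y)
```

Note that this simple concatenation only reaches every point when each $K_i$ is finite; if some $K_i=\infty$, the later processes are never reached, which is why the answer below uses a diagonal enumeration of $\mathbb{N}^2$ instead.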
I think you have exactly the right approach to this. I had to solve this problem for my thesis and want to share my solution.
To show that $$ \eta = \sum_{i=1}^\infty\eta_i =\sum_{i=1}^\infty\sum_{n=1}^{\kappa_i}\delta_{X_{n,i}} \overset{(\ast)}{=}\sum_{n=1}^\kappa \delta_{Y_n} $$ is a proper point process, we have to define $\kappa$ and $(Y_n)_{n\in\mathbb{N}}$. We need to do that for every $\omega\in\Omega$, so that $(\ast)$ holds $\mathbb{P}$-a.s.
For that, consider an enumeration of $\mathbb{N}^2$. One such enumeration is obtained as the inverse of the bijection $$\phi:\mathbb{N}^2\to \mathbb{N},\quad \phi(n,i)=n+\frac{1}{2}(n+i-1)(n+i-2).$$ To check for yourself that $\phi$ is a bijection, calculate a few pairs and arrange them in a matrix with the corresponding indices: $\phi$ runs through $\mathbb{N}^2$ along the diagonals $n+i=\mathrm{const}$.
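A quick numerical check of the bijection (a throwaway sketch; `phi` is the pairing function above and `phi_inv` a direct implementation of its inverse):

```python
def phi(n, i):
    # the pairing function from above: enumerates N^2 along the
    # diagonals n + i = const
    return n + (n + i - 1) * (n + i - 2) // 2

def phi_inv(k):
    # invert phi by locating the diagonal s = n + i that contains rank k
    s = 2
    while (s - 1) * s // 2 < k:
        s += 1
    n = k - (s - 2) * (s - 1) // 2
    return (n, s - n)

# the first few ranks with their pairs
print([(k, phi_inv(k)) for k in range(1, 7)])

# phi hits each of 1..45 exactly once on the first nine diagonals
ranks = sorted(phi(n, i) for n in range(1, 10) for i in range(1, 10) if n + i <= 10)
assert ranks == list(range(1, 46))
```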
Now for the technical part of the re-enumeration. First set $$\kappa(\omega):=\sum_{j=1}^\infty \kappa_j(\omega),$$ which is measurable as a countable sum of measurable functions. For a fixed $\omega\in\Omega$, we have to "skip" all pairs of indices $(m,j)$ for which $m>\kappa_j(\omega)$, since those are the indices that do not appear on the left side of $(\ast)$. For that, define for $n\le\kappa(\omega)$ $$\psi_n(\omega):=\phi^{-1}\big(r_n(\omega)\big),\qquad r_n(\omega):=\min\Big\{r\in\mathbb{N}:\ r-\sum_{(m,j)\in\mathbb{N}^2} \chi\big\{\phi(m,j)\leq r,\: m >\kappa_j(\omega)\big\}=n\Big\};$$ in words, $r_n(\omega)$ is the position in the $\phi$-enumeration of the $n$-th pair that is not skipped. (For $n>\kappa(\omega)$ the value of $\psi_n(\omega)$ never enters the sum in $(\ast)$, so we may set it to $(1,1)$, say.) Here, $\chi_A=\chi\{\omega\in A\}$ denotes the indicator function of the subset $A\subseteq\Omega$. To be a bit more explicit, we have for each $r\in\mathbb{N}$ and each pair $(m,j)\in\mathbb{N}^2$ the set $$A^r_{m,j}:=\big\{\omega\in\Omega:\:\underbrace{\phi(m,j)\leq r}_{(1)},\: \underbrace{m >\kappa_j(\omega)}_{(2)}\big\}.$$ Condition (1) ensures that only finitely many pairs contribute to the sum over all $(m,j)\in\mathbb{N}^2$, so the sum is finite and $r_n(\omega)$ is well-defined. Condition (2) is the "skipping" condition from above.
Since the $\kappa_j$ are measurable, all $A^r_{m,j}$ are measurable subsets of $\Omega$. Thus all indicator functions are measurable, as well as $r_n$ and $\psi_n:\Omega\to\mathbb{N}^2$ for every $n\in\mathbb{N}$.
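For intuition, the skipping re-enumeration can be sketched numerically for a single fixed $\omega$. Here `kappa` is a hypothetical finite list standing in for the realized values $\kappa_1(\omega), \kappa_2(\omega), \dots$ (zero beyond its end), and `psi` finds the $n$-th pair that is not skipped by walking along the $\phi$-enumeration:

```python
def phi_inv(k):
    # inverse of phi(n, i) = n + (n+i-1)(n+i-2)/2: locate the diagonal
    # s = n + i containing rank k, then read off the pair
    s = 2
    while (s - 1) * s // 2 < k:
        s += 1
    n = k - (s - 2) * (s - 1) // 2
    return (n, s - n)

def psi(n, kappa):
    # the n-th pair (m, j) in the phi-enumeration that is NOT skipped,
    # i.e. that satisfies m <= kappa[j-1]
    r, kept = 0, 0
    while kept < n:
        r += 1
        m, j = phi_inv(r)
        if j <= len(kappa) and m <= kappa[j - 1]:
            kept += 1
    return phi_inv(r)

# example: kappa_1 = 2, kappa_2 = 0, kappa_3 = 1 gives three points in
# total, re-enumerated as (1,1), (2,1), (1,3)
kappa = [2, 0, 1]
points = [psi(n, kappa) for n in range(1, sum(kappa) + 1)]
print(points)
```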
These $\psi_n$ give us the required measurable re-enumeration in the following sense: define for each $n\in\mathbb{N}$ the random point $Y_n:\Omega\to\mathbb{X}$ as $$Y_n(\omega):=X_{\psi_n(\omega)}(\omega):=X_{m,j}(\omega)\quad\text{when } \psi_n(\omega)=(m,j).$$ This is well-defined by construction ($\phi$ is bijective, and condition (2) excludes the skipped pairs). Measurability follows from the fact that each $X_{m,j}$ is measurable by assumption.
Finally, $(\ast)$ holds by the rearrangement theorem for series, applied to measures: for every measurable $B\subseteq\mathbb{X}$ the terms $\delta_{X_{n,i}}(B)$ are non-negative, so the double sum may be evaluated in any order of summation.
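The rearrangement step can be illustrated with a toy realization: a point process with finitely many atoms is just a multiset of points, and any re-enumeration leaves the induced counting measure unchanged. A minimal sketch (the labels are hypothetical placeholder points):

```python
from collections import Counter

# toy realization: points X_{n,i} for kappa_1 = 2, kappa_2 = 0, kappa_3 = 1,
# labelled by their index pair (n, i)
kappa = [2, 0, 1]
points = [("x", n, i) for i in range(1, 4) for n in range(1, kappa[i - 1] + 1)]

# any rearrangement of the atoms yields the same counting measure,
# here represented as a multiset via Counter
reordered = list(reversed(points))
assert Counter(points) == Counter(reordered)
```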