I am struggling with Example 13.1(b) in the second volume of "An Introduction to the Theory of Point Processes" by Daley and Vere-Jones.
Let $\xi = \sum_{j=1}^n \kappa_j \delta_{x_j}$ be a purely atomic random measure on $\mathcal{X}$. Assume that
- $P\{\xi \text{ has } n \text{ atoms}\} = p_n$, with $\mu := \sum_n n p_n < \infty$;
- the atoms' locations $x_j$ are i.i.d. with common distribution $F(\cdot)$;
- the masses $\kappa_j$ are independent with conditional distribution $\Pi(\cdot \mid x)$, which may depend on the location $x$ of the associated atom.
The goal is to study $$ \mathbb E \int_{\mathcal X} g(x, \xi) \xi (\mathrm d x) $$
The authors write $y_j = (x_j, \kappa_j) \in \mathcal{Y} = \mathcal{X} \times \mathbb R^+$ and take $g(x, \xi) = \alpha(x) h_n(y_1, \ldots, y_n)$, where $h_n$ is symmetric in its arguments. Conditioning on the number of atoms, the expectation takes the form
$$ \sum_{n=1}^\infty p_n \int_{\mathcal{Y}^{(n)}} \left[ \sum_{j=1}^n \kappa_j \alpha(x_j) h_n(y_1, \ldots, y_n) \right] \prod_{i=1}^n F(\mathrm d x_i) \Pi(\mathrm d\kappa_i \mid x_i) $$
They then write:
Because $h_n$ and the joint distribution are symmetric, this can be rewritten as $$ \mu \int_{\mathcal Y} \kappa \alpha(x) \Pi(\mathrm d\kappa \mid x) F(\mathrm d x) \int_{\mathcal Y^{(n-1)}} \left[\sum_{j=1}^{n-1} \kappa_j h_{n-1}(y_1, \ldots, y_{n-1}) \right] \prod_{i=1}^{n-1} F(\mathrm d x_i) \Pi(\mathrm d\kappa_i \mid x_i) $$
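For context, here is the part of the symmetry argument I can reconstruct myself (in my own notation, which may not match the book exactly): since $h_n$ and the product measure are invariant under permutations of $(y_1, \ldots, y_n)$, every summand in the bracket integrates to the same value, so

$$ \int_{\mathcal{Y}^{(n)}} \left[ \sum_{j=1}^n \kappa_j \alpha(x_j) h_n(y_1, \ldots, y_n) \right] \prod_{i=1}^n F(\mathrm d x_i) \Pi(\mathrm d\kappa_i \mid x_i) = n \int_{\mathcal{Y}^{(n)}} \kappa_1 \alpha(x_1) h_n(y_1, \ldots, y_n) \prod_{i=1}^n F(\mathrm d x_i) \Pi(\mathrm d\kappa_i \mid x_i). $$

So I can see where a single factor of $n$, and thus potentially $\mu = \sum_n n p_n$, could come from; what I cannot see is how this turns into the displayed expression, with $h_{n-1}$ in place of $h_n$ and a residual sum over $n-1$ masses.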
This step is very counter-intuitive to me: the right-hand side still involves $n$ even though the sum over $n$ has disappeared, and $h_n$ has become $h_{n-1}$. Is there something obvious I am missing?
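In case it is useful, I also ran a quick Monte Carlo sanity check of the first-moment special case $h_n \equiv 1$, $\alpha(x) = x$, where the expectation should reduce to $\mu \int_{\mathcal Y} \kappa\, \alpha(x)\, \Pi(\mathrm d\kappa \mid x)\, F(\mathrm d x)$. The toy distributions below are my own choices, not from the book, and the simulation does match the closed form:

```python
import random

def sample_xi_integral(rng):
    """Draw one realization of xi and return the integral of alpha(x) xi(dx),
    i.e. sum_j kappa_j * alpha(x_j) with alpha(x) = x."""
    n = rng.randint(1, 3)  # number of atoms: uniform on {1,2,3}, so mu = E[n] = 2
    total = 0.0
    for _ in range(n):
        x = rng.random()                        # location ~ F = Uniform(0, 1)
        kappa = rng.expovariate(1.0 / (1 + x))  # mass ~ Exponential with mean 1 + x
        total += kappa * x                      # alpha(x) = x, h_n == 1
    return total

rng = random.Random(0)
N = 200_000
mc = sum(sample_xi_integral(rng) for _ in range(N)) / N

# Closed form: mu * int_0^1 x * E[kappa | x] dx = 2 * int_0^1 x (1 + x) dx
#            = 2 * (1/2 + 1/3) = 5/3
exact = 5.0 / 3.0
print(mc, exact)
```

This only checks the degenerate case $h_n \equiv 1$, of course, so it does not by itself explain the $h_{n-1}$ in the quoted formula.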