Poisson process uniquely identified proof: what is $\Gamma_r((\Theta ∟ A_i)^r)$?


I'm self-studying Poisson point processes and I can't understand the existence proof in the theorem stating that a Poisson process is uniquely determined by a locally finite measure $\Theta$ (pages 60-62 in [1]). Below I quote the theorem and the proof extract; the steps I do not understand are underbraced and labelled $[nu1]$–$[nu3]$, and my questions follow the extract.

Theorem

Let $\Theta$ be a locally finite measure without atoms on $E$. Then there exists a Poisson process in $E$ with intensity measure $\Theta$; it is uniquely determined (up to equivalence).

Proof

Let's prove the existence.

Since $\Theta$ is locally finite, there are (by Theorem 12.1.1) pairwise disjoint Borel sets $A_1, A_2, \ldots$ in $E$ with $E=\cup_{i\in \mathbb{N}} A_i$, $\Theta(A_i) < \infty$, and such that to each $C \in \mathcal{C}$ there exists $k \in \mathbb{N}$ with $C \subset \cup_{i=1}^k A_i$. In each $A_i$, we define a point process $X^{(i)}$ in the following way. For $r \in \mathbb{N}$, write $A_i^r = A_i \times \ldots \times A_i$ ($r$ factors) and let

$$\Gamma_r: A_i^r \rightarrow\mathsf{N},$$

be the map defined by

$$\Gamma_r(x_1, \ldots, x_r) := \sum_{j=1}^r \delta_{x_j} $$

Then $\Gamma_r$ is $(\mathcal{B}(A_i^r), \mathcal{N})$-measurable. Let $\Delta_0$ denote the Dirac measure on $\mathsf{N}$ concentrated at the zero measure. Then

$$ \mathbb{P}_i := e^{-\Theta(A_i)} \left( \Delta_0 + \sum_{r \in \mathbb{N}} \frac{1}{r!}\, \Gamma_r\big(\underbrace{(\Theta ∟ A_i)^r}_{[nu1]}\big) \right) \quad (1) $$

is a probability measure on $\mathsf{N}$ which is concentrated on the counting measures $\eta$ with $\operatorname{supp} \eta \subset A_i$. (For the normalization, observe that $\underbrace{\Gamma_r((\Theta ∟ A_i)^r)(\mathsf{N}) = \Theta(A_i)^r}_{[nu2]}$.) Let $X_1, X_2, \ldots$ be an independent sequence of point processes in $E$ such that $X_i$ has distribution $\mathbb{P}_i$ (for example, we may take as underlying probability measure $\mathbb{P} = \otimes_{i \in \mathbb{N}} \mathbb{P}_i$, and for $X_i$ the $i$th coordinate mapping). Finally, we put

$$ X:= \sum_{i\in \mathbb{N}} X_i$$
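To convince myself of this construction, I simulated it: under $\mathbb{P}_i$, a realization of $X_i$ is a Poisson$(\Theta(A_i))$-distributed number of points placed i.i.d. according to the normalized restriction $\Theta ∟ A_i / \Theta(A_i)$ (this is my reading of $(1)$, not a quote from the book). A minimal sketch in a toy setting of my own choosing, $E = [0, 10)$ with $\Theta$ the Lebesgue measure and $A_i = [i-1, i)$:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_X_i(low, high):
    """One realization of X_i under P_i, with Theta = Lebesgue measure and
    A_i = [low, high): a Poisson(Theta(A_i)) number of points, placed
    i.i.d. according to Theta|A_i / Theta(A_i), i.e. uniformly on A_i."""
    n = rng.poisson(high - low)            # r, with weight e^{-Theta(A_i)} Theta(A_i)^r / r!
    return rng.uniform(low, high, size=n)  # the r i.i.d. point locations

def sample_X(cells):
    """Superposition X = sum_i X_i over the partition E = U_i A_i."""
    return np.concatenate([sample_X_i(l, h) for l, h in cells])

cells = [(i, i + 1.0) for i in range(10)]  # A_1, ..., A_10 partition [0, 10)

# X(A) for A = [2.5, 6.5) should then be Poisson with mean Theta(A) = 4.
counts = np.array([np.count_nonzero((x >= 2.5) & (x < 6.5))
                   for x in (sample_X(cells) for _ in range(20000))])
print(counts.mean(), counts.var())  # both should be close to 4
```

The empirical mean and variance of $X(A)$ agree, as expected for a Poisson distribution.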

[...] $\leftarrow$ omitted here for brevity

Let $A\in\mathcal{B}$ be a set with $\Theta(A) < \infty$ and put $A_i':=A\cap A_i$ for $i \in \mathbb{N}$; then

$$ X(A) = \sum_{i\in\mathbb{N}} X_i(A) = \sum_{i\in\mathbb{N}} X_i(A_i').$$

By construction, the random variables $X_1(A_1'), X_2(A_2'), \ldots$ are independent. Again from construction, for each $k \in \mathbb{N}$ we have

$$ \begin{align} \mathbb{P}(X_i(A_i')=k) &= \mathbb{P}_i(\{\eta \in \mathsf{N}: \eta(A_i')=k\})\\ & = \sum_{r=k}^\infty \mathbb{P}_i(\{\eta \in \mathsf{N}: \eta(A_i')=k,\ \eta(A_i \setminus A_i') = r-k\})\\ & \underbrace{=}_{[nu3]}e^{-\Theta(A_i)} \sum_{r=k}^\infty {r \choose k} \frac{1}{r!}\, \Theta(A_i')^k\, \Theta(A_i \setminus A_i')^{r-k} \end{align} $$

[...]
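Before stating my questions, here is how I currently picture $\Gamma_r$ acting on a single $r$-tuple: it produces the counting measure with an atom at each coordinate (multiplicities included, although $\Theta$ being atomless should make coincident points a null event). A toy encoding of my own, representing a counting measure as a function of set indicators:

```python
from collections import Counter

def Gamma_r(*points):
    """Gamma_r(x_1, ..., x_r) = sum_j delta_{x_j}: the counting measure
    eta with eta(B) = #{j : x_j in B}, counted with multiplicity."""
    multiset = Counter(points)
    def eta(indicator):  # a Borel set B, encoded as its indicator function
        return sum(mult for x, mult in multiset.items() if indicator(x))
    return eta

eta = Gamma_r(0.2, 0.5, 0.5, 0.9)  # r = 4 points, one repeated
print(eta(lambda x: x < 0.6))      # eta([0, 0.6)) = 3
print(eta(lambda x: True))         # total mass eta(E) = r = 4
```

What I cannot see is how to go from this pointwise picture to the expressions $\Gamma_r((\Theta ∟ A_i)^r)$ used in the proof.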

Questions

$[nu1]$: Knowing that $\Theta ∟ A_i: E \cap A_i \rightarrow \mathbb{R}^+$, I don't know what $(\Theta ∟ A_i)^r$ means. If $(\Theta ∟ A_i)^r = \Theta ∟ A_i \times \ldots \times \Theta ∟ A_i$, then I would expect its output to lie in $\mathbb{R}^r$, but $\Gamma_r$ is defined on $A_i^r$.

$[nu2]$: Can someone explain the equality $\Gamma_r((\Theta ∟ A_i)^r)(\mathsf{N}) = \Theta(A_i)^r$ to me? How can $\Gamma_r$ be evaluated at $\mathsf{N}$?

$[nu3]$: How does $\mathbb{P}_i$ operate on subsets of $\mathsf{N}$ to give the underbraced equality, i.e. how is $\Gamma_r((\Theta ∟ A_i)^r)$ evaluated on the set $\{\eta \in \mathsf{N}: \eta(A_i')=k,\ \eta(A_i \setminus A_i') = r-k\}$? How is ${r \choose k}$ introduced here?

Reference

[1] R. Schneider and W. Weil, "Stochastic and Integral Geometry", Springer, 2008.