Blackwell's argument: Quadratic variation as an upper bound on an expected value in Kingman's book on stochastic processes


Let $S$ be a Borel subset of a complete separable metric space, and let $S^{*}:=S\times(0,\infty)$. There exists a countable family of subsets $B_{1},B_{2},\ldots\subseteq S$ with the property that for any two points $x_{1}\neq x_{2}$ of $S$ there exists an $n\in\mathbb{N}$ such that $B_{n}$ contains one of the two points but not the other.
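(For concreteness, here is a standard way to obtain such a family; this is my own illustration, not from Kingman. Assuming $S$ has a countable dense subset $\{q_{k}\}$, take the countable family of balls $$B_{(k,m)}:=\left\{x\in S:d(x,q_{k})<\tfrac{1}{m}\right\},\qquad k,m\in\mathbb{N},$$ re-indexed via a bijection $\mathbb{N}\times\mathbb{N}\to\mathbb{N}$. If $x_{1}\neq x_{2}$, choose $m$ with $\tfrac{1}{m}<\tfrac{1}{2}d(x_{1},x_{2})$ and then $q_{k}$ with $d(q_{k},x_{1})<\tfrac{1}{m}$; by the triangle inequality $d(q_{k},x_{2})>\tfrac{1}{m}$, so $B_{(k,m)}$ contains $x_{1}$ but not $x_{2}$.)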

It is therefore possible to associate with each $x\in S$ an element $\xi(x)=(\xi_{1}(x),\xi_{2}(x),\ldots)\in\{0,1\}^{\infty}$, where $$\xi_{n}(x)=\begin{cases}1 & \text{if }x\in B_{n}\\0 & \text{else.}\end{cases}$$ It is convenient not to distinguish between $x$ and $\xi(x)$.

For $x\in S$ let $x_{n}:=(\xi_{1}(x),\ldots,\xi_{n}(x))$, and for $y\in\{0,1\}$ let $x_{n}y:=(\xi_{1}(x),\ldots,\xi_{n}(x),y)$. If $\sigma=(\epsilon_{1},\ldots,\epsilon_{n})\in\{0,1\}^{n}$, let $\left<\sigma\right>:=\left\{x\in S:x_{n}=\sigma\right\}$.
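(For intuition, here is a toy instance of this encoding; it is my own illustration, not from Kingman. Take $S=[0,1)$ and let $B_{n}$ be the set of points whose $n$-th binary digit is $1$ in the canonical expansion; then $\xi(x)$ is the binary expansion of $x$, and $\left<\sigma\right>$ is a dyadic interval.)

```python
def xi(x, n):
    """n-th binary digit of x in [0,1), canonical expansion: 1 iff x is in B_n."""
    return int(x * 2**n) % 2

def prefix(x, n):
    """The tuple x_n = (xi_1(x), ..., xi_n(x))."""
    return tuple(xi(x, k) for k in range(1, n + 1))

def in_cylinder(x, sigma):
    """Membership test for the cylinder set <sigma> = {x : x_n = sigma}."""
    return prefix(x, len(sigma)) == sigma

# Two distinct points are separated by some B_n:
x1, x2 = 0.3, 0.30001
n = 1
while xi(x1, n) == xi(x2, n):
    n += 1
# now B_n contains one of the two points but not the other
```

(The loop terminates once $2^{n}$ magnifies the gap $|x_{1}-x_{2}|$ past a binary digit, mirroring how some $B_{n}$ must separate distinct points.)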

In Section 8.2, Kingman constructs a Poisson process $\Pi^{*}$ on $S^{*}$ and a purely atomic random measure $\Psi$ with the same distribution as the given random measure $\Phi$, and in Section 8.3 he defines a measure $\gamma$ by the condition that \begin{equation*} \mathbb{E}\left(\sum\left\{z^{\frac{1}{2}}:(x,z)\in\Pi^{*},\,x\in\left<\sigma\right>\right\}\right)=\int z^{\frac{1}{2}}\,\gamma(\left<\sigma\right>,dz). \end{equation*} Kingman then claims that $\mathbb{E}(w_{n}\Phi(S))\leq\sum_{\sigma\text{ of length }n}\rho\left<\sigma\right>^{2}$. (In fact he writes $\mathbb{E}(w_{n}(S))$ instead of $\mathbb{E}(w_{n}\Phi(S))$, but I assume this is a typographical error.) I do not understand why the sum of the $\rho\left<\sigma\right>^{2}$ is an upper bound.

I apologize in advance if the answer is obvious; probability theory is not my field of expertise.