Which probability inequality is applied in this proof?


I'm trying to understand which probability inequality is used at the end of the following proof of Kingman's theorem in Revuz's Markov Chains book. He is considering a probability space with probability measure $m$ and an $m$-preserving (i.e. $m\circ\theta^{-1}=m$) measurable map $\theta$.

[Three images: the relevant passage of the proof from Revuz's book.]

It looks like Markov's inequality, but that would only yield the bound $\le\varepsilon^{-1}\int\sup_{0\le r<k}s_r\,{\rm d}m\sum_{n\in\mathbb N}\frac1n$, where $\sum_{n\in\mathbb N}\frac1n=\infty$.


Best Answer

Recall that the layer-cake representation allows us to write $$\int \sup_{0 \leq r < k} s_r \,dm = \int_0^\infty m\Big(\sup_{0 \leq r < k} s_r > t\Big) \,dt$$ since the $s_r$ are assumed to be nonnegative.
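For reference, here is a one-line sketch of why the layer-cake identity holds, via Tonelli's theorem; $X$ below stands for the nonnegative variable $\sup_{0 \le r < k} s_r$:

```latex
\begin{align*}
\int X \,dm
&= \int \left( \int_0^\infty \mathbf{1}_{\{t < X(\omega)\}} \,dt \right) dm(\omega)
   && \text{since } X(\omega) = \int_0^{X(\omega)} dt \\
&= \int_0^\infty \left( \int \mathbf{1}_{\{X > t\}} \,dm \right) dt
   && \text{(Tonelli, nonnegative integrand)} \\
&= \int_0^\infty m(X > t) \,dt .
\end{align*}
```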

Now the change of variables $t = \varepsilon s$, together with $\lceil s\rceil \geq s$ (which makes the event in the second line smaller), gives \begin{align*} \int \sup_{0 \leq r < k} s_r \,dm &= \varepsilon \int_0^\infty m\Big(\sup_{0 \leq r < k} s_r > \varepsilon s\Big) \,ds \\ &\geq \varepsilon \int_0^\infty m\Big(\sup_{0 \leq r < k} s_r > \varepsilon \lceil s\rceil\Big) \,ds \\ &= \varepsilon \sum_{n=1}^\infty m\Big(\sup_{0 \leq r < k} s_r > \varepsilon n\Big), \end{align*} which rearranges to the desired inequality.
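If it helps, the final equality can be checked interval by interval: on $(n-1, n]$ the integrand is constant because $\lceil s\rceil = n$ there, so

```latex
\begin{align*}
\int_0^\infty m\Big(\sup_{0 \le r < k} s_r > \varepsilon \lceil s\rceil\Big)\,ds
&= \sum_{n=1}^\infty \int_{n-1}^{n} m\Big(\sup_{0 \le r < k} s_r > \varepsilon n\Big)\,ds \\
&= \sum_{n=1}^\infty m\Big(\sup_{0 \le r < k} s_r > \varepsilon n\Big),
\end{align*}
```

each inner integral contributing a factor of $1$, the length of $(n-1, n]$.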