Reducing integrals over abstract spaces to integrals on $\mathbb{R}_{+}$ with respect to the Lebesgue measure


I am studying for a measure-theory-based course in probability theory, and I am stuck on a theorem that goes as follows.

Definition

First, it defines a measurable space $(E,\mathcal{E})$ to be standard if it is isomorphic to $(F,\mathcal{B}_F)$ for some Borel subset $F$ of $\mathbb{R}$.

Theorem

Let $(E,\mathcal{E})$ be a standard measurable space. Let $\mu$ be a $\sigma$-finite measure on $(E,\mathcal{E})$ and put $b=\mu(E)$, possibly $+\infty$. Then there exists a mapping $h$ from $[0, b)$ into $E$, measurable relative to $\mathcal{B}_{[0,b)}$ and $\mathcal{E}$, such that \begin{equation*} \mu=\lambda \circ h^{-1}, \end{equation*} where $\lambda$ is the Lebesgue measure on $[0,b)$.

My issue is that the textbook does not give a full proof of this theorem. It gives a sketch of a proof, but one that relies on the result of an exercise (where $(E,\mathcal{E})=\big(\mathbb{R}_+,\mathcal{B}({\mathbb{R}_+})\big)$, $\mu$ is left unspecified, and $h^{-1}$ is replaced by the cumulative distribution function) which is not solved, so I am missing a crucial piece. I worked through that exercise as best I could, but I can only obtain the result under the assumption that the cumulative distribution function is never constant (i.e., strictly increasing), so I do not have a proof under more general assumptions.
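For what it is worth, here is the numerical experiment I used to convince myself that the conclusion should still hold when the cumulative distribution function has jumps and flat stretches. It assumes (my guess at what the exercise intends) that $h$ is the generalized inverse $h(t)=\inf\{x : F(x)\ge t\}$, and it uses a hypothetical measure $\mu = \tfrac12\,\delta_1 + \tfrac12\,\mathrm{Unif}([2,3])$ on $\mathbb{R}_+$, whose cdf $F$ jumps at $x=1$ and is flat on $[1,2]$:

```python
import numpy as np

# Hypothetical test measure (b = mu(R_+) = 1):
#   mu = 0.5 * delta_{1} + 0.5 * Uniform([2, 3])
# Its cdf F jumps at x = 1 and is flat on [1, 2] -- exactly the features
# that break an argument assuming F is strictly increasing.

def F(x):
    """Cumulative distribution function of mu."""
    if x < 1:
        return 0.0
    if x < 2:
        return 0.5
    if x < 3:
        return 0.5 + 0.5 * (x - 2)
    return 1.0

def h(t):
    """Generalized inverse h(t) = inf{x : F(x) >= t}, written in closed
    form for this particular F."""
    if t <= 0.5:
        return 1.0
    return 2.0 + 2.0 * (t - 0.5)

# Approximate the pushforward lambda o h^{-1} on test sets A by the
# fraction of a fine grid of t's in [0, 1) with h(t) in A.
ts = (np.arange(200000) + 0.5) / 200000   # midpoints of a grid on [0, 1)
hs = np.array([h(t) for t in ts])

atom_mass = np.mean(np.isclose(hs, 1.0))            # mu({1}) = 0.5
interval_mass = np.mean((hs >= 2.0) & (hs <= 2.5))  # mu([2, 2.5]) = 0.25
print(atom_mass, interval_mass)  # prints 0.5 0.25
```

The jump of $F$ at $1$ becomes a flat stretch of $h$ (carrying the atom's mass), and the flat stretch of $F$ on $[1,2]$ becomes a jump of $h$, so the pushforward of Lebesgue measure still matches $\mu$ on these sets. Of course, this is only a sanity check on one example, not a proof.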

This theorem is not even given a name in the textbook, which makes it harder for me to find it (and a proof) elsewhere. I assume there is a textbook with a formal and complete proof of this important result.

If you could give me a proof, that would be incredibly helpful, but even just a reference to a source containing a full proof (or a name for the theorem) would be great. Thanks to anyone who takes the time to reply.