Proving the Lebesgue decomposition theorem


The book I'm following proves that every measure $\mu$ on $\mathbb{R}$ can be uniquely represented as $\mu = \mu_{\mathrm{discrete}} + \mu_{\mathrm{abs.\,cont.}} + \mu_{\mathrm{singular}}$, using the notion of a candidate density at one of the steps. Namely, a candidate density is any function $g \geq 0$ such that $\int_E g \, d\lambda \leq \mu(E)$ for every Borel set $E$ (where $\mu$ is assumed to have no discrete component, which is easy to arrange by subtracting it off).
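To make sure I understand the definition, here is a simple example of my own (not from the book): take $\mu = 2\lambda$ restricted to $[0,1]$. Then any measurable $g$ with $0 \leq g \leq 2 \cdot \mathbf{1}_{[0,1]}$ is a candidate density, since for every Borel set $E$ \begin{align*} \int_E g \, d\lambda \leq \int_E 2 \cdot \mathbf{1}_{[0,1]} \, d\lambda = 2\lambda(E \cap [0,1]) = \mu(E). \end{align*} In particular, $g \equiv 0$ is trivially a candidate density for any $\mu$.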

Then the proof shows that for every pair of candidate densities $g_1, g_2$, their pointwise maximum $\max(g_1, g_2)$ is also a candidate density: indeed, the claim is that \begin{align*} \int_E \max(g_1, g_2) \, d\lambda & = \int_{E \cap \{g_1 \geq g_2\}} g_1 \, d\lambda + \int_{E \cap \{g_1 < g_2\}} g_2 \, d\lambda \\ & \leq \mu(E \cap \{g_1 \geq g_2\}) + \mu(E \cap \{g_1 < g_2\}) = \mu(E), \end{align*} where the last equality holds because the two sets partition $E$. My question is whether the inequality is actually justified here. For it to hold by the definition of a candidate density, $E \cap \{g_1 \geq g_2\}$ must itself be a Borel set, but what stops $g_1$ and $g_2$ from behaving badly enough that $\{g_1 \geq g_2\}$ fails to be Borel, breaking the whole argument?