An integral in two variables without using Tonelli


I had the following question on my mid-term exam in a graduate real analysis course. The professor had covered Bruckner's text only up to Chapter 5, so the Tonelli-Fubini theorem had not been taught, and by rule we were not allowed to use it because it is not part of the undergraduate curriculum!

So here is the question :

Let $(X, \mathcal{M}, \mu)$ be a (positive) measure space and $f: X \to \mathbb{R}$ be a measurable function. For $t>0$ define $\lambda(t) = \mu(\{x \in X : |f(x)|>t\})$. Show that if we consider Lebesgue measure on $(0,\infty)$, then $\int_X |f| \, d\mu = \int_{(0,\infty)} \lambda(t) \, dt$.
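Although not a proof, the identity can be sanity-checked numerically on a finite measure space, where both integrals reduce to finite sums. A minimal sketch (all data and function names below are illustrative, not part of the problem):

```python
# Sanity check of the layer-cake identity on a finite measure space
# X = {0, ..., m-1} with weights mu[i]; data is made up.

def lhs(f, mu):
    """Direct integral: sum of mu(x) * |f(x)| over the finite space."""
    return sum(m * abs(v) for v, m in zip(f, mu))

def rhs(f, mu):
    """Integral of lambda(t) = mu({|f| > t}) over (0, infinity).

    lambda is a step function whose jumps occur at the values |f(x)|,
    so the integral is an exact finite sum of rectangle areas.
    """
    cuts = sorted(set(abs(v) for v in f))  # breakpoints of lambda
    total, prev = 0.0, 0.0
    for c in cuts:
        # lambda is constant on (prev, c); its value there is mu({|f| > prev})
        lam = sum(m for v, m in zip(f, mu) if abs(v) > prev)
        total += lam * (c - prev)
        prev = c
    return total

f  = [3.0, -1.5, 0.0, 2.0, -1.5]
mu = [0.5, 2.0, 1.0, 0.25, 0.75]
assert abs(lhs(f, mu) - rhs(f, mu)) < 1e-12
```

Both sides come out to the same number, as the identity predicts.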

The instructor didn't solve the problem, but mentioned that the solution is based on first proving the identity for simple functions and then obtaining the general case by taking limits.

P.S. I will offer a bounty of 500 for the accepted answer.

Here's an initial push to get you started.

Suppose that $f(x) = \sum_{k=1}^n a_k 1_{A_k}$, where the $A_k$ are pairwise disjoint measurable sets and $a_0 = 0 < |a_1| \leq \cdots \leq |a_n|$. We find that $\lambda(t) = \sum \{\mu(A_k):|a_k| > t\}$, where I use $\sum S$ to denote the sum of the elements of the (finite) multiset $S$. In particular, $\lambda$ is constant on each interval $(|a_{j-1}|, |a_j|)$, so $$ \begin{align} \int_{(0,\infty)} \lambda(t)\,dt &= \sum_{j=1}^n \int_{|a_{j-1}|}^{|a_j|} \lambda(t)\,dt \\ & = \sum_{j=1}^n \int_{|a_{j-1}|}^{|a_j|} \sum_{k=j}^n \mu(A_k)\,dt = \sum_{j=1}^n \sum_{k=j}^n (|a_j| - |a_{j-1}|)\mu(A_k). \end{align} $$ On the other hand, since the $A_k$ are disjoint, $$ \int_X |f|\,d\mu = \sum_{k=1}^n |a_k|\cdot \mu(A_k). $$ Show that these two sums are equal.
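As a quick numerical check on hypothetical values (the arrays `a` and `mu` below are made up; `mu[k]` stands for $\mu(A_k)$), the rearranged double sum and the direct integral agree:

```python
# Check that the rearranged double sum equals sum_k |a_k| mu(A_k),
# with a[0] = 0 and the a[k] sorted by absolute value; data is illustrative.
a  = [0.0, -1.0, 1.5, -1.5, 4.0]   # a[0] = 0, |a[1]| <= ... <= |a[n]|
mu = [0.0, 2.0, 0.5, 1.0, 0.25]    # mu[k] plays the role of mu(A_k)
n = len(a) - 1

# sum over j of (|a_j| - |a_{j-1}|) * sum over k >= j of mu(A_k)
double_sum = sum((abs(a[j]) - abs(a[j - 1])) * mu[k]
                 for j in range(1, n + 1)
                 for k in range(j, n + 1))
# direct integral of |f| for the simple function
direct = sum(abs(a[k]) * mu[k] for k in range(1, n + 1))
assert abs(double_sum - direct) < 1e-12
```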


Lemma: Let $h$ be a simple function such that $h \geq 0$ and $h \ne 0$, and let $\lambda_h(t) = \mu(\{x\in X : h(x) >t \})$. Then $$\int_X h \, d\mu = \int_{(0, \infty)} \lambda_h(t) \, dt$$ Proof: First, note that any simple function $h$ with $h \geq 0$ and $h \ne 0$ can be written as $h= \sum_{k=1}^n a_k \chi_{A_k}$, where:

  1. $A_k$ are pairwise disjoint measurable sets
  2. $0 < a_1 < a_2 < \cdots < a_n$.

Note that $$\lambda_h(t) = \mu(\{x\in X : h(x) >t \})= \sum_{k\in \{r \ : \ a_r >t\}} \mu(A_k)$$ So $\lambda_h$ is constant on each of the intervals $(0, a_1), [a_1, a_2), \ldots , [a_{n-1}, a_n), [a_n, \infty)$. Note that on $[a_n, \infty)$, $\lambda_h=0$.

So, defining $a_0 = 0$, we have
\begin{align*} \int_{(0, \infty)} \lambda_h (t)\, dt &= \sum_{i=1}^n (a_{i} -a_{i-1}) \sum_{k=i}^n\mu(A_k) \\ &= \sum_{i=1}^n a_{i} \sum_{k=i}^n\mu(A_k) - \sum_{i=1}^n a_{i-1} \sum_{k=i}^n\mu(A_k) \\ & = \sum_{i=1}^n a_{i} \sum_{k=i}^n\mu(A_k) - \sum_{i=0}^{n-1} a_{i} \sum_{k=i+1}^n\mu(A_k) \\ &= a_n \mu(A_n) + \sum_{i=1}^{n-1} a_{i} \sum_{k=i}^n\mu(A_k) - \sum_{i=1}^{n-1} a_{i} \sum_{k=i+1}^n\mu(A_k) \\ &= a_n \mu(A_n) + \sum_{i=1}^{n-1} a_{i} \left (\sum_{k=i}^n\mu(A_k) -\sum_{k=i+1}^n\mu(A_k) \right ) \\ & = a_n \mu(A_n) + \sum_{i=1}^{n-1} a_i \mu(A_i) \\ &= \sum_{i=1}^{n} a_i \mu(A_i) = \int_X h\, d\mu \end{align*} which completes the proof. $\square$
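The lemma's telescoping computation can also be checked numerically on made-up data (the arrays below are illustrative; `mu[i]` stands for $\mu(A_i)$):

```python
# Verify that the step-function integral of lambda_h equals
# sum_i a_i mu(A_i) for hypothetical data.
a  = [0.0, 0.5, 1.0, 2.5]    # a[0] = 0 < a[1] < ... < a[n]
mu = [0.0, 1.5, 0.75, 2.0]   # mu[i] plays the role of mu(A_i), A_i disjoint
n = len(a) - 1

# lambda_h is constant on [a_{i-1}, a_i), where it equals sum_{k >= i} mu(A_k)
step_integral = sum((a[i] - a[i - 1]) * sum(mu[i:])
                    for i in range(1, n + 1))
# direct integral of the simple function h
integral_h = sum(a[i] * mu[i] for i in range(1, n + 1))
assert abs(step_integral - integral_h) < 1e-12
```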

Now let us prove the main result.

If $| f | $ is identically $0$, then the result is trivial. So, let us suppose that $|f| \ne 0$.

Let $(h_n)$ be a sequence of simple functions such that, for all $n$, $0\leq h_n \leq |f|$, $h_n \ne 0$, and $h_n \nearrow |f|$ pointwise. Such a sequence exists by the standard simple-function approximation theorem, and since $|f| \ne 0$ we may take every $h_n \ne 0$ after discarding finitely many initial terms.

Then, for all $t>0$, $\{x\in X : h_n(x) >t \} \nearrow \{x\in X : |f(x)| >t \}$: the sets increase with $n$, and if $|f(x)| > t$ then $h_n(x) > t$ for all large $n$, because $h_n(x) \to |f(x)|$. So, by continuity of $\mu$ from below, for all $t>0$, $$\lambda_{h_n}(t)= \mu(\{x\in X : h_n(x) >t \}) \nearrow \mu(\{x\in X : |f(x)| >t \}) = \lambda(t)$$ Now, applying the Monotone Convergence Theorem (once on $X$, and once on $(0,\infty)$ to the increasing sequence $\lambda_{h_n}$) together with the previous lemma, we have $$ \int_X |f| d\mu = \lim_n \int_X h_n d\mu = \lim_n \int_{(0, \infty)} \lambda_{h_n} (t) dt =\int_{(0, \infty)} \lambda(t) dt $$
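The limiting step can be illustrated numerically with the standard dyadic approximations $h_n = \min(\lfloor 2^n |f| \rfloor / 2^n,\, n)$ on a small finite measure space (the data below are made up):

```python
import math

# Illustrate that the integrals of the dyadic simple approximations h_n
# increase to the integral of |f|; data is hypothetical.
f  = [3.0, -1.25, 0.7, 2.0]
mu = [0.5, 2.0, 1.0, 0.25]

def integral(g, mu):
    """Integral over the finite space: sum of g(x) * mu(x)."""
    return sum(m * v for v, m in zip(g, mu))

target = integral([abs(v) for v in f], mu)
for n in range(1, 12):
    # standard dyadic approximation: min(floor(2^n |f|)/2^n, n)
    h_n = [min(math.floor((2 ** n) * abs(v)) / 2 ** n, n) for v in f]
    assert all(0 <= hv <= abs(v) for hv, v in zip(h_n, f))  # 0 <= h_n <= |f|

# after enough steps the integrals agree to high accuracy
assert abs(integral(h_n, mu) - target) < 1e-2
```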

Remark: The more natural way to solve this exercise is to use Tonelli's Theorem. But, as you stated, you are not supposed to use Tonelli's Theorem.