Why does the monotone convergence theorem not apply on Riemann integrals?


I just learned this version of the monotone convergence theorem in my measure theory class:

For every monotonically increasing sequence of measurable functions $f_n$ from a measure space $(X, \mu)$ to $[0, \infty]$, $$ \text{if}\quad \lim_{n\to \infty}f_n = f \text{ pointwise}, \quad\text{then}\quad \lim_{n\to \infty}\int f_n \, \mathrm{d}\mu = \int f \,\mathrm{d}\mu . $$

I tried to find out why this theorem applies only to the Lebesgue integral, but I couldn't find a counterexample for Riemann integrals, so I would appreciate your help.

(I guess that $f$ might not be Riemann integrable in some cases, but I want a concrete example.)


BEST ANSWER

Riemann integrable functions (on a compact interval) are also Lebesgue integrable, and the two integrals coincide. So whenever $f$ and all the $f_n$ are Riemann integrable, the theorem holds for the Riemann integrals as well.

However, the pointwise increasing limit of a sequence of Riemann integrable functions need not be Riemann integrable. Let $(r_n)$ be an enumeration of the rationals in $[0,1]$, and let $f_n$ be as follows:

$$f_n(x) = \begin{cases} 1 & \text{if $x \in \{ r_0, r_1, \dots, r_{n-1} \}$} \\ 0 & \text{if $x \in \{ r_n, r_{n+1}, \dots \}$} \\ 0 & \text{if $x$ is irrational} \\ \end{cases}$$

Then the limit function is the indicator of the rationals in $[0,1]$, which is nowhere continuous, hence not Riemann integrable.
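A quick numerical sketch of why the limit fails to be Riemann integrable (Python; all names here are mine): tagged Riemann sums of the limit function give $1$ when every tag is rational and $0$ when every tag is irrational, no matter how fine the partition, so the sums cannot converge.

```python
from fractions import Fraction
import math

def dirichlet(x):
    # Pointwise limit of the f_n: 1 on the rationals, 0 elsewhere.
    # A float cannot encode rationality, so we pass exact Fractions for
    # rational tags and floats (irrational by construction) otherwise.
    return 1 if isinstance(x, Fraction) else 0

def riemann_sum(f, tags, width):
    # Tagged Riemann sum over a uniform partition of [0, 1].
    return sum(f(t) * width for t in tags)

N = 1000
width = Fraction(1, N)

# Midpoint of each subinterval [k/N, (k+1)/N]: an exact rational tag.
rational_tags = [Fraction(2 * k + 1, 2 * N) for k in range(N)]
# Left endpoint shifted by sqrt(2)/(2N) < 1/N: an irrational tag
# lying in the same subinterval.
irrational_tags = [k / N + math.sqrt(2) / (2 * N) for k in range(N)]

print(riemann_sum(dirichlet, rational_tags, width))    # -> 1
print(riemann_sum(dirichlet, irrational_tags, width))  # -> 0
```

Each $f_n$, by contrast, differs from $0$ at only finitely many points, so $\int_0^1 f_n = 0$ in the Riemann sense.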

ANSWER

Here is a version of the Monotone convergence theorem for Riemann integrals that can be proved without referring to measure theory:

Theorem. Let $\{f_n\}$ be a nondecreasing sequence of Riemann integrable functions on $[a,b]$ converging pointwise to a Riemann integrable function $f$ on $[a,b]$. Then $$ \lim_{n\to \infty}\int_a^b f_n(x)\,dx=\int_a^b f(x)\,dx. $$

An elementary proof is given in this paper.
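A small numerical check of this theorem on a concrete sequence of my own choosing, $f_n(x) = \min(nx, 1)$: it is nondecreasing in $n$, each $f_n$ is Riemann integrable, and the pointwise limit ($f(0) = 0$, $f(x) = 1$ for $x > 0$) is Riemann integrable too, with $\int_0^1 f_n = 1 - 1/(2n) \to 1 = \int_0^1 f$.

```python
def f(n, x):
    # f_n(x) = min(n*x, 1): pointwise nondecreasing in n on [0, 1].
    return min(n * x, 1.0)

def midpoint_integral(g, a=0.0, b=1.0, m=100_000):
    # Midpoint-rule approximation of the Riemann integral of g over [a, b].
    h = (b - a) / m
    return sum(g(a + (k + 0.5) * h) for k in range(m)) * h

for n in (1, 10, 100, 1000):
    # Exact value is 1 - 1/(2n); the approximations increase toward 1.
    print(n, midpoint_integral(lambda x: f(n, x)))
```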

ANSWER

I spent about two weeks working on an elementary proof of this problem earlier this year.

There is an elementary proof of something slightly stronger: let each $f_n$ be Riemann integrable on $[0,1]$ with $\sup_n |f_n(x)| \leq 1$ for all $x$, and suppose $f_n \to 0$ pointwise; then $\int_0^1 f_n(x)\,dx \to 0$. (This has the obvious generalisation to an arbitrary interval, with pointwise convergence to a Riemann integrable function and the $f_n$ uniformly bounded.)
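To see why a uniform bound of this kind is needed (the sequences below are my own illustration, not from the proof), compare the bounded spike $g_n(x) = nx(1-x)^n$, whose maximum $(n/(n+1))^{n+1}$ stays below $1$, with the unbounded spike $h_n(x) = n^2 x(1-x)^n$. Both tend pointwise to $0$ on $[0,1]$, yet only the bounded sequence has integrals tending to $0$.

```python
def bounded_integral(n):
    # Exact: integral of n*x*(1-x)**n over [0,1] is n/((n+1)*(n+2)) -> 0.
    return n / ((n + 1) * (n + 2))

def unbounded_integral(n):
    # Exact: integral of n**2*x*(1-x)**n over [0,1] is n**2/((n+1)*(n+2)) -> 1,
    # even though h_n -> 0 pointwise (the spikes grow without bound).
    return n * n / ((n + 1) * (n + 2))

for n in (1, 10, 100, 1000):
    print(n, bounded_integral(n), unbounded_integral(n))
```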

I would be happy to write up my proof after my exams are finished this summer :)

This is a slightly harder version of a problem given on this problem set for second-year undergraduates at the University of Cambridge, although that problem also allows the assumption that the $f_n$ are continuous.