Question about convergence of Riemann Sums


I would like to prove the following statement just using the definition of continuity (or uniform continuity) without appealing to any theorem about Darboux sums. Here is the statement:

Let $f:[a,b]\to \mathbb{R}$ be continuous, and for each $n\in \mathbb{N}$, let $\Delta(x,n)=(b-a)/n$, let $x_{i,n}=a+i\Delta(x,n)$ for all $i\in \{0,...,n\}$, let $m_{i,n}$ and $M_{i,n}$ be the minimum and maximum of $f$ on $[x_{i-1,n},x_{i,n}]$ respectively for all $i\in \{1,...,n\}$, let $$ m(f,n)=\sum_{i=1}^n m_{i,n}\Delta(x,n) \quad \& \quad M(f,n)=\sum_{i=1}^n M_{i,n}\Delta(x,n), $$ and suppose $\lim_{n\to \infty}(M(f,n)-m(f,n))=0$. Then $\lim_{n\to \infty}M(f,n)=\lim_{n\to \infty}m(f,n)$.
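For a concrete feel for the quantities $m(f,n)$ and $M(f,n)$, they can be computed directly whenever the subinterval extrema are known in closed form. A minimal sketch (the choice $f(x)=x^2$ on $[0,1]$ is illustrative, not part of the question; since this $f$ is increasing, the minimum and maximum on each subinterval sit at its endpoints):

```python
# Lower and upper Darboux sums m(f,n), M(f,n) on the uniform partition,
# for an increasing f (so min/max on each subinterval are its endpoints).
def darboux_sums(f, a, b, n):
    dx = (b - a) / n                         # Delta(x, n)
    lower = sum(f(a + (i - 1) * dx) * dx for i in range(1, n + 1))
    upper = sum(f(a + i * dx) * dx for i in range(1, n + 1))
    return lower, upper

f = lambda x: x * x                          # illustrative choice
for n in (10, 100, 1000):
    m_n, M_n = darboux_sums(f, 0.0, 1.0, n)
    print(n, m_n, M_n, M_n - m_n)            # the gap shrinks like 1/n
```

For an increasing $f$ the gap telescopes to $(f(b)-f(a))\,\Delta(x,n)$, which is why it shrinks like $1/n$ here.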

Best answer:

Continuity of $f$ is not actually necessary for the conclusion (more on that below), and when $f$ is continuous the hypothesis

$$\tag{1}\lim_{n\to \infty}(M(f,n)-m(f,n))=0$$

is redundant, since it follows from the uniform continuity of $f$ on the compact interval $[a,b]$. Indeed, given $\epsilon > 0$, uniform continuity provides $\delta > 0$ such that $|x-y| < \delta$ implies $|f(x) - f(y)| < \epsilon/(b-a)$. Thus, if $n > (b-a)/\delta$, every subinterval has length $\Delta(x,n) < \delta$, so $M_{i,n}-m_{i,n} < \epsilon/(b-a)$ for all $i$, and hence $M(f,n) - m(f,n) = \sum_{i=1}^n (M_{i,n}-m_{i,n})\,\Delta(x,n) < \epsilon$.
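As a numerical sanity check of this threshold (not part of the proof), take $f(x)=\sin x$ on $[0,\pi/2]$; these concrete choices are illustrative assumptions. This $f$ is 1-Lipschitz, so $\delta = \epsilon/(b-a)$ works, and it is increasing on this interval, so the subinterval extrema sit at the endpoints:

```python
import math

# Sanity check of the epsilon-delta bound for f(x) = sin(x) on [0, pi/2].
# f is 1-Lipschitz, so delta = eps/(b-a) suffices, and n > (b-a)/delta,
# i.e. n > (b-a)**2/eps, should force M(f,n) - m(f,n) < eps.
a, b = 0.0, math.pi / 2
eps = 0.01
n = math.floor((b - a) ** 2 / eps) + 1     # smallest n past the threshold

dx = (b - a) / n
# f increasing => min/max on each subinterval are at its endpoints
lower = sum(math.sin(a + (i - 1) * dx) * dx for i in range(1, n + 1))
upper = sum(math.sin(a + i * dx) * dx for i in range(1, n + 1))
assert upper - lower < eps                  # the bound from the argument above
print(n, upper - lower)
```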

Two possibilities come to mind for proving that $\lim_{n\to \infty}M(f,n) = \lim_{n \to \infty} m(f,n)$. One way is first to show independently that the limits $L= \lim_{n\to \infty}M(f,n)$ and $L'= \lim_{n\to \infty}m(f,n)$ exist. Such existential limit proofs either require finding suitable candidate limits that can be verified directly, or inferring indirectly that the limits exist by showing the sequences have the Cauchy property. It would then follow from (1) that $L = L'$.

The easiest way in this case is to show that there exist numbers $L$ and $L'$ such that for all $n$,

$$\tag{2}m(f,n) \leqslant L' \leqslant L \leqslant M(f,n)$$

Once (2) is established, an application of (1) immediately shows that $L = L'$. Furthermore, because

$$0 \leqslant \begin{cases}M(f,n) - L \\L-m(f,n)\end{cases} \leqslant M(f,n) - m(f,n),$$

it follows by the squeeze principle that $L = \lim_{n\to \infty}M(f,n)= \lim_{n\to \infty}m(f,n)$.

The inequality (2) can be established for any bounded function where $M_{i,n} = \sup_{x \in [x_{i-1,n},x_{i,n}]}f(x)$ and $m_{i,n} = \inf_{x \in [x_{i-1,n},x_{i,n}]}f(x)$. If $f$ happens to be continuous, then the supremum and infimum are maximum and minimum values attained by $f$ on the interval as defined in the problem statement. It can be shown just by properties of the maximum and minimum on partition subintervals that for any $p, q \in \mathbb{N}$ we have

$$M(f,p) \geqslant m(f,q).$$

(For instance, the uniform partition into $pq$ subintervals refines both partitions, and refining a partition can only decrease an upper sum and increase a lower sum, so $M(f,p) \geqslant M(f,pq) \geqslant m(f,pq) \geqslant m(f,q)$.) Thus $L = \inf\{M(f,n): n \in \mathbb{N}\}$ and $L' = \sup\{m(f,n): n \in \mathbb{N}\}$ exist as finite numbers with $L' \leqslant L$, and consequently inequality (2) holds.
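A quick numerical check of this cross-partition inequality (the non-monotone function $f(x)=|x-\tfrac12|$ on $[0,1]$ is an illustrative choice, picked because its exact subinterval infima and suprema are easy to write down; its integral is $1/4$):

```python
# Check M(f,p) >= m(f,q) across different uniform partitions for the
# non-monotone continuous function f(x) = |x - 1/2| on [0, 1].
def sums(n):
    dx = 1.0 / n
    lower = upper = 0.0
    for i in range(1, n + 1):
        l, r = (i - 1) * dx, i * dx
        vals = (abs(l - 0.5), abs(r - 0.5))
        # f attains 0 inside the subinterval iff it straddles x = 1/2;
        # otherwise extrema are at the endpoints (f is monotone there)
        mn = 0.0 if l <= 0.5 <= r else min(vals)
        lower += mn * dx
        upper += max(vals) * dx
    return lower, upper

ns = range(1, 31)
results = {n: sums(n) for n in ns}
# every upper sum dominates every lower sum, even across partitions
assert all(results[p][1] >= results[q][0] for p in ns for q in ns)
# hence sup of lower sums <= inf of upper sums, bracketing the integral 1/4
Lp = max(results[q][0] for q in ns)
L = min(results[p][1] for p in ns)
print(Lp, L)
```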