Suppose $f:[a,b]\to\mathbb{R}$ is Riemann integrable. Prove that $\int_{a}^{b}f=\lim_{n\to\infty} \frac{b-a}{n}\sum_{j=1}^{n} f(a+\frac{j(b-a)}{n}).$


I am reading "Measure, Integration & Real Analysis" by Sheldon Axler.
The following exercises are Exercise 7 and Exercise 8 in Exercises 1A on p.8.

Exercise 7 on p.8:
Suppose $f:[a,b]\to\mathbb{R}$ is a bounded function. For $n\in\mathbb{Z}^+$, let $P_n$ denote the partition that divides $[a,b]$ into $2^n$ intervals of equal size. Prove that $$L(f,[a,b])=\lim_{n\to\infty} L(f,P_n,[a,b]) \text{ and }U(f,[a,b])=\lim_{n\to\infty} U(f,P_n,[a,b]).$$
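As a numerical sanity check for Exercise 7 (not a proof), here is a small Python sketch. It takes $f(x)=x^2$ on $[0,1]$ (my own example, not from the book); since this $f$ is increasing, the infimum and supremum on each subinterval are attained at the left and right endpoints, so the lower and upper Darboux sums over the dyadic partitions $P_n$ are just left- and right-endpoint sums. Both should approach $\int_0^1 x^2\,dx = 1/3$.

```python
# Illustration of Exercise 7 for f(x) = x^2 on [0, 1]: the lower and upper
# Darboux sums over P_n (2**n equal intervals) both converge to 1/3.

def darboux_sums(f, a, b, n):
    """Lower and upper Darboux sums of f over the partition of [a, b]
    into 2**n equal intervals, assuming f is increasing on [a, b]
    (so inf/sup on each subinterval sit at the left/right endpoint)."""
    m = 2 ** n
    h = (b - a) / m
    lower = sum(f(a + i * h) * h for i in range(m))        # left endpoints (inf)
    upper = sum(f(a + (i + 1) * h) * h for i in range(m))  # right endpoints (sup)
    return lower, upper

f = lambda x: x * x
for n in (1, 5, 10, 15):
    lo, up = darboux_sums(f, 0.0, 1.0, n)
    print(n, lo, up)  # both columns approach 1/3 as n grows
```

Of course this only illustrates the statement for one convenient $f$; the exercise asks for a proof for an arbitrary bounded $f$.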

Exercise 8 on p.8:
Suppose $f:[a,b]\to\mathbb{R}$ is Riemann integrable. Prove that $$\int_{a}^{b} f = \lim_{n\to\infty} \frac{b-a}{n}\sum_{j=1}^{n} f(a+\frac{j(b-a)}{n}).$$

I want to solve Exercise 8.
I know the following theorem:

THEOREM 1 (from Michael Spivak "Calculus Fourth Edition"):
Suppose that $f$ is integrable on $[a,b]$. Then for every $\epsilon>0$ there is some $\delta>0$ such that, if $P=\{t_0,\dots,t_n\}$ is any partition of $[a,b]$ with all lengths $t_i-t_{i-1}<\delta$, then $$\left|\sum_{i=1}^{n} f(x_i)(t_i-t_{i-1})-\int_{a}^{b} f(x) dx \right|<\epsilon,$$ for any Riemann sum formed by choosing $x_i$ in $[t_{i-1},t_i]$.

To solve Exercise 8, it suffices to apply THEOREM 1 (or to reproduce Spivak's proof of it): the uniform partition of $[a,b]$ into $n$ equal intervals has mesh $\frac{b-a}{n}$, which is less than $\delta$ for all sufficiently large $n$, and choosing $x_j=a+\frac{j(b-a)}{n}$ (the right endpoint of each subinterval) makes the Riemann sum in THEOREM 1 exactly the sum in Exercise 8.
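As a quick numerical check of the statement in Exercise 8 (again not a proof), the right-endpoint sums $\frac{b-a}{n}\sum_{j=1}^{n} f(a+\frac{j(b-a)}{n})$ can be computed directly. I use $f(x)=x^2$ on $[0,1]$ as a test case of my own, so the limit should be $1/3$:

```python
# Right-endpoint Riemann sums from Exercise 8:
#   (b - a)/n * sum_{j=1}^{n} f(a + j*(b - a)/n)
# For f(x) = x^2 on [0, 1] these should approach 1/3.

def right_endpoint_sum(f, a, b, n):
    h = (b - a) / n
    return h * sum(f(a + j * h) for j in range(1, n + 1))

f = lambda x: x * x
for n in (10, 100, 1000, 10000):
    print(n, right_endpoint_sum(f, 0.0, 1.0, n))  # approaches 1/3
```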

But I suspect the solution the author has in mind for Exercise 8 is different from the one above, since THEOREM 1 is not proved in Axler's book.

Can we use the result of Exercise 7 as a lemma to solve Exercise 8?