Does this process for building partitions lead to convergence to the Riemann integral?


Let $f: [0,1] \to \mathbb{R}$ be continuous. Let us generate a sequence of partitions $P_m$ of $[0,1]$ in the following inductive way: Define $P_1: 0 = t_{1,0} < t_{1,1} = 1$. Given some partition $P_m : 0 = t_{m,0} < t_{m,1} < \cdots < t_{m,m} =1$, select the subinterval $[t_{m,i-1}, t_{m,i}]$ for which $\sup_{t \in [t_{m,i-1}, t_{m,i}]} |f(t) - f(t_{m,i-1})|$ is largest, and form the new partition

$$P_{m+1}:0 = t_{m,0} < t_{m,1} < \cdots < t_{m, i-1} < \frac{t_{m,i-1} + t_{m,i}}{2} < t_{m,i} < \cdots < t_{m,m} =1.$$

In the event of a tie, choose to subdivide the leftmost subinterval of largest size.

Is it the case that $\lim_{m \to \infty} \sum_{i =1}^m f(t_{m,i-1})(t_{m,i} - t_{m,i-1}) = \int^1_0f(t)dt$?
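For experimentation, the scheme can be sketched in Python. This is only a sketch under an assumption: each $\sup_{t \in [a,b]} |f(t) - f(a)|$ is approximated by sampling equally spaced points, since the true sup is not computable in general. The names `refine` and `left_riemann_sum` are mine, not standard.

```python
def refine(f, n_points, n_samples=200):
    """Build a partition of [0,1] by repeatedly bisecting the subinterval
    [a, b] with the largest (approximate) sup of |f(t) - f(a)|, breaking
    ties by choosing the leftmost subinterval of largest size.

    The sup is approximated by sampling n_samples + 1 equally spaced
    points -- an approximation, since the exact sup is not computable."""
    pts = [0.0, 1.0]
    while len(pts) < n_points:
        best_i, best_score = 0, -1.0
        for i in range(len(pts) - 1):
            a, b = pts[i], pts[i + 1]
            score = max(abs(f(a + (b - a) * k / n_samples) - f(a))
                        for k in range(n_samples + 1))
            tie = abs(score - best_score) <= 1e-15
            # strictly better score wins; on a tie, prefer the larger
            # subinterval (leftmost wins among equal sizes, since we
            # scan left to right and only replace on strict improvement)
            if (score > best_score and not tie) or \
               (tie and b - a > pts[best_i + 1] - pts[best_i]):
                best_i, best_score = i, score
        a, b = pts[best_i], pts[best_i + 1]
        pts.insert(best_i + 1, 0.5 * (a + b))
    return pts

def left_riemann_sum(f, pts):
    """Left-endpoint Riemann sum of f over the partition pts."""
    return sum(f(pts[i]) * (pts[i + 1] - pts[i])
               for i in range(len(pts) - 1))
```

For example, with $f(t) = t^2$ the left Riemann sums over `refine(f, n)` appear to approach $\int_0^1 t^2\,dt = 1/3$ as $n$ grows, consistent with the conjecture.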

It seems to me that, under this algorithm, one should be able to argue that the norm of $P_m$ tends to zero, whence the given Riemann sums converge to the Riemann integral by uniform continuity and the dominated convergence theorem. However, I haven't managed to argue this, so there may be a counterexample lurking. Something worrying is that, when a subinterval is bisected, the right half's deviation $\sup_{t \in [(t_{m,i-1} + t_{m,i})/2,\, t_{m,i}]} |f(t) - f((t_{m,i-1} + t_{m,i})/2)|$ may exceed the parent's $\sup_{t \in [t_{m,i-1}, t_{m,i}]} |f(t) - f(t_{m,i-1})|$, since the reference point changes. (The left half's deviation $\sup_{t \in [t_{m,i-1}, (t_{m,i-1} + t_{m,i})/2]} |f(t) - f(t_{m,i-1})|$ cannot exceed the parent's, being a sup of the same quantity over a subset.) For instance, a piecewise-linear $f$ equal to $0$ at the left endpoint, $1$ at the midpoint, and $-1$ at the right endpoint has parent deviation $1$ but right-half deviation $2$.
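This possibility can be checked numerically. The sketch below uses a hypothetical piecewise-linear "tent" function on $[0,1]$ with values $0$, $1$, $-1$ at $0$, $1/2$, $1$, and approximates each sup by sampling:

```python
def f(t):
    # piecewise-linear: f(0) = 0, f(1/2) = 1, f(1) = -1 (a made-up example)
    return 2.0 * t if t <= 0.5 else 1.0 - 4.0 * (t - 0.5)

# parent deviation on [0, 1], reference point 0
sup_parent = max(abs(f(k / 1000) - f(0.0)) for k in range(1001))
# right-half deviation on [1/2, 1], reference point 1/2
sup_right = max(abs(f(0.5 + 0.5 * k / 1000) - f(0.5)) for k in range(1001))
```

Here `sup_parent` comes out to $1$ while `sup_right` comes out to $2$: bisection has doubled the largest deviation rather than shrinking it.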

I would be happy to prove that this algorithm converges for a stricter class of $f$ (Lipschitz continuous, continuously differentiable, etc.).