I am failing to understand why the integral is defined as:
$$\int_a^b f(x) dx = \lim_{n\to\infty}\sum_{i=1}^n f(x_i^*)\Delta x$$
instead of:
$$\int_a^b f(x)dx=\sum_{i=1}^\infty f(x_i^*)\Delta x$$
Is the former just a matter of notational preference, or is there something I am not understanding conceptually here?
Let's cut $[a, b]$ up into infinitely many equally spaced slices. So $x_1 = a$, obviously. What's $x_2$? What's $\Delta x$, if not zero?
In general, there isn't a "nice" way to cut a finite interval into infinitely many sampling intervals of equal length: if $\Delta x$ were $0$, every term $f(x_i^*)\Delta x$ would be $0$, and a sum of infinitely many zeros cannot recover the area. That's why the definition keeps $n$ finite, with $\Delta x = (b-a)/n > 0$, and only then takes the limit. Ultimately, the Riemann integral samples a bunch of function values in a fairly uniform way (one from each subinterval of length $\Delta x$) and averages them, weighted by $\Delta x$. It doesn't make sense to make an infinite uniform sampling of a bounded interval.
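A small numerical sketch may make the limit concrete (a hypothetical `riemann_sum` helper in Python; the midpoint sampling choice is just one valid choice of $x_i^*$). For $f(x) = x^2$ on $[0, 1]$, the exact integral is $1/3$. Each row uses a *finite* $n$ with $\Delta x = (b-a)/n > 0$; the approximations approach $1/3$ as $n$ grows, but there is no "$n = \infty$" row to evaluate directly, because $\Delta x$ would then be $0$ and every term would vanish.

```python
def riemann_sum(f, a, b, n):
    """Riemann sum with n equal subintervals, sampling f at midpoints."""
    dx = (b - a) / n          # positive only because n is finite
    return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

# Approximations to the integral of x^2 over [0, 1] (exact value: 1/3)
for n in (10, 100, 1000, 10000):
    print(n, riemann_sum(lambda x: x * x, 0.0, 1.0, n))
```

The limit in the definition is exactly the statement that these finite sums get arbitrarily close to $1/3$ as $n \to \infty$, even though no single finite sum equals it.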