The derivative function has the following definition using the limit: $$f'(x) = \lim_{h\to0} \frac{f(x+h)-f(x)}{h}$$
I was wondering whether I could find a similar definition for the integral. I found:
$$\int_0^1 f(x) dx = \lim_{i\to\infty} \sum_{j=0}^{i} \frac{1}{i+1} f\left(\frac{j}{i}\right)$$
Of course, this could be extended to
$$\int_a^b f(x) dx = \lim_{i\to\infty} \sum_{j=0}^{i} \frac{b-a}{i+1} f\left((b-a)\frac{j}{i}+a\right)$$
I have verified this for $f(x)=x$ and constant $f(x)$, and it seemed to work, but limits of this kind are hard to work with because they contain a sum.
I have looked at Riemann integration, but the formula seems to be different.
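The check for $f(x)=x$ can also be sketched numerically. Below is a minimal Python sketch (not part of the original post; the function name `approx_integral` is my own) that evaluates the proposed sum for a large $i$ and compares it with the exact value $\int_0^1 x\,dx = \tfrac{1}{2}$:

```python
def approx_integral(f, i):
    """Evaluate the proposed formula: sum_{j=0}^{i} (1/(i+1)) * f(j/i)."""
    return sum(f(j / i) for j in range(i + 1)) / (i + 1)

# For f(x) = x the sum telescopes to exactly 1/2 for every i,
# so even modest i gives the right answer (up to floating-point rounding).
print(approx_integral(lambda x: x, 100000))  # close to 0.5
```

For less trivial integrands the sum only approaches the integral as $i\to\infty$, which is why symbolic work with this definition gets hard quickly.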
Questions:
- Does my limit definition make sense?
- Is there any reason to work with this? It seems not, as even something as simple as $f(x)=x$ is quite hard with this. But maybe there are certain types of integrals where this works out quite nicely.
- Why does this formula look so different from the Riemann one?
You are on the right track. Both the derivative and the integral are defined using limits. The one for the integral is harder to read, perhaps harder to understand, certainly harder to calculate with. Your formula is in fact an example of a Riemann sum. You don't need the more general form to understand the idea: you are approximating an area by a collection of thin rectangles.
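To make the thin-rectangle picture concrete, here is a short Python sketch (my own illustration, not from the original answer) of the standard left Riemann sum: partition $[a,b]$ into $n$ rectangles of width $(b-a)/n$ and add up their areas.

```python
def left_riemann_sum(f, a, b, n):
    """Approximate the integral of f on [a, b] with n left-endpoint rectangles."""
    width = (b - a) / n
    return sum(f(a + j * width) for j in range(n)) * width

# Example: the integral of x^2 on [0, 1] is 1/3; the sum approaches it as n grows.
print(left_riemann_sum(lambda x: x * x, 0.0, 1.0, 100000))
```

Up to the bookkeeping of endpoints and the $1/(i+1)$ versus $(b-a)/n$ weights, this is the same idea as the formula in the question.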
The Greeks and some Renaissance mathematicians knew how to calculate some areas with this kind of approximation strategy.
What Newton and Leibniz discovered when they invented calculus (or discovered it, depending on your philosophy of mathematics) is the Fundamental Theorem of Calculus, which says (pretty much) that if you can somehow guess or figure out an antiderivative for a function then you can calculate its integral (an area) without having to think explicitly about adding up the areas of rectangles that approximate it.
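A small Python sketch (my own illustration of the theorem, not part of the original answer) of what the Fundamental Theorem buys you: for $f(x)=x^2$, an antiderivative is $F(x)=x^3/3$, so the integral over $[a,b]$ is just $F(b)-F(a)$, with no rectangles in sight.

```python
def F(x):
    """An antiderivative of f(x) = x^2."""
    return x ** 3 / 3

a, b = 0.0, 1.0
# By the Fundamental Theorem of Calculus, this equals the integral of x^2
# on [0, 1], i.e. 1/3 -- the same value the Riemann sums converge to.
print(F(b) - F(a))
```

Evaluating two numbers replaces an infinite limiting process, which is exactly why the antiderivative approach dominates in practice.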