I am trying to convert an integral to a Riemann sum like this:
$$ \int_a^b f(x) \,dx = \lim_{n\to \infty} \sum_{i=1}^n f(x_i)\,\Delta x, $$
where $\Delta x = \frac{b-a}{n}$ and $x_i = a + i\,\Delta x$.
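As a quick numerical sanity check of this definition (with a hypothetical choice $f(x) = x^2$ on $[0, 1]$, where the exact integral is $1/3$), the right-endpoint Riemann sum approaches the integral as $n$ grows:

```python
# Right-endpoint Riemann sum for f on [a, b] with n rectangles.
def riemann_sum(f, a, b, n):
    dx = (b - a) / n  # Delta x = (b - a) / n
    # x_i = a + i * dx for i = 1, ..., n (right endpoints)
    return sum(f(a + i * dx) for i in range(1, n + 1)) * dx

# Integral of x^2 on [0, 1] is exactly 1/3; the sum converges to it.
approx = riemann_sum(lambda x: x * x, 0.0, 1.0, 100_000)
print(approx)  # close to 0.33333...
```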
My attempt:
$$ \int_1^{n+1} f(x)\, dx = \lim_{n\to \infty} \sum_{k=1}^n f(1+k), $$
since $\Delta x = 1$ and $x_k = 1+k$.
I believe this is wrong, though. How do I take the upper bound $n+1$ into account?
Thank you.
In this case you've got $\Delta x = \frac{n}{k}$ and $x_i = 1 + \frac{in}{k}$, so that $$\int_1^{n+1} f(x) \, dx = \lim_{k\to\infty} \sum_{i=1}^k f\left(1 + \frac{in}{k}\right)\frac{n}{k}.$$
Note that $n$ is a constant in this example, so you need to choose another letter, say $k$, to represent the variable that determines the number of rectangles in your approximation. The variable $i$ is then the "dummy variable" that tells you which rectangle you are referring to when computing the sum.
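You can check this numerically for a fixed $n$ (here a hypothetical $f(x) = x^2$ with $n = 3$, so the integral runs from $1$ to $4$ and equals $\frac{4^3 - 1}{3} = 21$): the sum over $k$ rectangles tends to the integral as $k \to \infty$.

```python
# Riemann sum for the integral of f from 1 to n+1, where n is a
# fixed constant and k (the number of rectangles) is what grows.
def riemann_sum_1_to_n_plus_1(f, n, k):
    dx = n / k  # Delta x = ((n+1) - 1) / k = n / k
    # x_i = 1 + i * n / k for i = 1, ..., k
    return sum(f(1 + i * dx) for i in range(1, k + 1)) * dx

n = 3  # constant: integrate f from 1 to n+1 = 4
exact = (4**3 - 1**3) / 3  # integral of x^2 on [1, 4] is 21
approx = riemann_sum_1_to_n_plus_1(lambda x: x * x, n, 200_000)
print(approx)  # close to 21
```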