Regarding averages and mean value theorem for integrals


The mean value theorem for integrals states: if $f$ is continuous on an interval $I = [a,b]$, then for some $c \in I$ we have: $$\int_{a}^{b}f(x) \,dx = f(c)(b-a)$$

In many sources it is said that the mean value theorem for integrals guarantees that if $f(x)$ is continuous on $[a,b]$, then there exists a point $c \in [a,b]$ such that the value of the function at $c$ equals the average value of $f(x)$ over $[a,b]$.

Now my question is: how is the average value defined? The average value of $n$ numbers $a_1, a_2, \ldots, a_n$ is given by $\frac{a_1+a_2+\cdots+a_n}{n}$. Here we divide by the number of elements, but above we divide by the length of the interval. Can anyone please help me understand the transition? I understand it intuitively, but not rigorously.

I know how for a nonnegative continuous function $f(x)$ the integral over $[a,b]$ represents the area of the region bounded by $x=a,$ $x=b$, $y=0$ and $y=f(x).$ By the mean value theorem there exists a rectangle with base $b-a$ and height $f(\xi)$ with the same area where $\xi\in [a,b]$.
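The rectangle picture can be checked numerically. A minimal sketch, using $f(x) = x^2$ on $[0,2]$ as an assumed example (not from the question): the exact integral is $8/3$, so the theorem promises a $c$ with $f(c) = \frac{8/3}{2} = 4/3$, and the rectangle with base $b-a$ and height $f(c)$ has the same area as the region under the curve.

```python
import math

# Assumed example: f(x) = x^2 on [a, b] = [0, 2].
def f(x):
    return x * x

a, b = 0.0, 2.0
integral = (b**3 - a**3) / 3      # exact value of the integral of x^2 over [a, b]
average = integral / (b - a)      # average value = 4/3

c = math.sqrt(average)            # solve f(c) = average for c (f is increasing here)
assert a <= c <= b                # c lies in the interval, as the theorem guarantees
# the rectangle with base (b - a) and height f(c) has the same area:
assert abs(f(c) * (b - a) - integral) < 1e-12
print(c)
```

Here $c = \sqrt{4/3} \approx 1.1547$, which indeed lies in $[0,2]$.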

Best Answer

Think of how the integral of $f$ is usually defined. You take thinner and thinner rectangles that fit under the curve of $f$ and sum up their areas. In other words, it's the limit of: $$ \sum_{i=1}^N f(x_i) \delta $$ as $N$ grows to infinity, where $\delta=\frac{b-a}{N}$ is the width of a rectangle, and $x_i=a+i\delta$, which makes $f(x_i)$ the height of the $i$-th rectangle.

So, that sum converges to $\int_a^b f(x)\,dx$, but notice that if you were to take the average of the values $(f(x_1),f(x_2),\ldots,f(x_N))$, you'd write it as: $$ \frac{\sum\limits_{i=1}^N f(x_i)}{N} $$ which is almost the sum we defined above. Multiplying the numerator and denominator by $\delta$, you actually get: $$ \frac{\sum\limits_{i=1}^N f(x_i)\delta}{N\delta} = \frac{\sum\limits_{i=1}^N f(x_i)\delta}{b-a} $$ which converges to $\frac{1}{b-a}\int_a^b f(x)\,dx$ as $N$ grows to infinity! So that's why it seems natural to call this the average value of $f$ over $[a,b]$, as it's the same as taking the average of more and more samples of $f$.
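The limit above is easy to watch happen numerically. A short sketch, with $f(x) = \sin x$ on $[0,\pi]$ as an assumed example (not from the answer): the average of $N$ equally spaced samples $f(x_i)$, $x_i = a + i\delta$, approaches $\frac{1}{b-a}\int_a^b f(x)\,dx = \frac{2}{\pi}$ as $N$ grows.

```python
import math

# Assumed example: f(x) = sin(x) on [a, b] = [0, pi].
def f(x):
    return math.sin(x)

a, b = 0.0, math.pi
exact_average = 2.0 / math.pi          # (1/(b-a)) * integral of sin over [0, pi]

for N in (10, 100, 10_000):
    delta = (b - a) / N                # width of each rectangle
    # sample at x_i = a + i*delta, i = 1..N, as in the answer
    samples = [f(a + i * delta) for i in range(1, N + 1)]
    avg = sum(samples) / N             # equals sum(f(x_i)*delta) / (b - a)
    print(N, avg)
```

Running this shows the sample average settling toward $2/\pi \approx 0.6366$ as $N$ increases, which is exactly the "average of more and more samples" reading of the formula.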