Here is the most common and basic version of the Mean Value Theorem for Integrals:
Let $f$ be continuous on $[a, b]$. Then there exists $c \in [a, b]$ such that $f(c) = \frac{1}{b-a} \int_a^b f(x) dx$.
My question is: Can we change the existence interval for $c$ from $[a, b]$ to $(a, b)$? And if so, why is the version above much more commonly seen even though it is weaker?
Two proofs are usually presented: one applies the Mean Value Theorem for Derivatives to the function $F(x) = \int_a^x f(t) dt$; the other uses the Extreme Value Theorem and the Intermediate Value Theorem.
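To make the comparison concrete, here is a sketch of the first proof, using only the stated hypothesis that $f$ is continuous on $[a, b]$: by the Fundamental Theorem of Calculus, $F$ is continuous on $[a, b]$ and differentiable on $(a, b)$ with $F' = f$, so the Mean Value Theorem for Derivatives already produces a point $c \in (a, b)$ with

$$f(c) = F'(c) = \frac{F(b) - F(a)}{b - a} = \frac{1}{b-a} \int_a^b f(x) dx.$$

So this proof yields the conclusion on the open interval $(a, b)$ with no extra work.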
For the second method, I think we can avoid some technicalities by treating separately the case where $f$ is constant and the case where it is not. Beyond that, I don't see why we should settle for $c \in [a, b]$ instead of the sharper $c \in (a, b)$.
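To spell out the case split I have in mind (this is my own sketch, so please correct me if I am missing something): write $\mu = \frac{1}{b-a} \int_a^b f(x) dx$. If $f$ is constant, then $f(c) = \mu$ for every $c \in (a, b)$. If $f$ is not constant, the Extreme Value Theorem gives points $x_m, x_M \in [a, b]$ with $f(x_m) = m$ and $f(x_M) = M$, the minimum and maximum of $f$ on $[a, b]$. If $\mu = M$, then

$$\int_a^b \big( M - f(x) \big) dx = 0$$

with $M - f$ continuous and nonnegative, which forces $f \equiv M$ and contradicts that $f$ is not constant; the same argument rules out $\mu = m$. Hence $m < \mu < M$, and the Intermediate Value Theorem applied to $f$ on the closed interval between $x_m$ and $x_M$ yields $c$ with $f(c) = \mu$. Since $\mu \neq m$ and $\mu \neq M$, this $c$ lies strictly between $x_m$ and $x_M$, so $c \in (a, b)$.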
Edit: Here are some references that state the theorem with $[a, b]$:
- p.474 of the book "Calculus: Early Transcendentals, 9th Edition" by James Stewart et al.
- Exercise 16 on p.215 of the book "Introduction to Real Analysis, 4th Edition" by Robert G. Bartle et al.
- https://math.libretexts.org/Courses/Monroe_Community_College/MTH_210_Calculus_I_(Seeburger)/05%3A_Integration/5.03%3A_The_Fundamental_Theorem_of_Calculus