Take $f \in L^1(\mathbb{R})$ and define $g(x) = \int_x^{x+1} f(t) \, dt$. If $g(a) > 0$ and $g(b) < 0$, is it necessarily true that there is some $c \in [a,b]$ such that $g(c) = 0$?
I feel as though this should be a one-line proof, but working with the "sliding window" is giving me difficulty.
Edit: Is it as simple as applying the Lebesgue differentiation theorem? I feel as though there's a subtlety here I am not seeing.
Yes, and you don't need the Lebesgue differentiation theorem, only continuity of the indefinite integral. Write $F(x) = \int_0^x f(t) \, dt$. Since $f \in L^1(\mathbb{R})$, the function $F$ is (uniformly) continuous, by absolute continuity of the Lebesgue integral. Then $g(x) = F(x+1) - F(x)$ is continuous as a difference of continuous functions, so the intermediate value theorem applied on $[a,b]$ gives some $c \in [a,b]$ with $g(c) = 0$.
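To make the continuity step fully explicit, here is one way to write out the estimate (a sketch; for concreteness take $y < x$, the other case being symmetric):

$$
g(x) - g(y) = \int_x^{x+1} f(t)\,dt - \int_y^{y+1} f(t)\,dt
= \int_{y+1}^{x+1} f(t)\,dt - \int_y^{x} f(t)\,dt,
$$

hence

$$
|g(x) - g(y)| \le \int_{y+1}^{x+1} |f(t)|\,dt + \int_y^{x} |f(t)|\,dt \longrightarrow 0 \quad \text{as } y \to x,
$$

where both terms vanish by absolute continuity of the integral of the $L^1$ function $|f|$ (equivalently, by dominated convergence with dominating function $|f|$). No pointwise differentiation of $F$ is ever needed.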