Mean value theorem for sliding window of Lebesgue integral of integrable function


Take $f \in L^1(\mathbb{R})$ and define $g(x) = \int_x^{x+1} f(t) \, dt$. If $g(a) > 0$ and $g(b) < 0$, is it necessarily true that there is some $c \in [a,b]$ such that $g(c) = 0$?

I feel as though this should be a one-line proof, but working with the "sliding window" is giving me difficulty.

Edit: is it as simple as applying the Lebesgue differentiation theorem? I feel as though there's a subtlety here I am not seeing.


BEST ANSWER

The function $G(x) = \int_0^x f(t) \, dt$ is continuous (by the absolute continuity of the Lebesgue integral), so $g(x) = G(x+1) - G(x)$ is continuous as a difference of continuous functions. The intermediate value theorem then gives the desired $c \in [a,b]$ with $g(c) = 0$.
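As a numerical illustration (not a proof), here is a sketch in Python with a hypothetical integrable $f$, not one from the thread: the sliding-window integral $g$ changes sign, so bisection, which is justified precisely by the continuity of $g$, locates a zero.

```python
import math

def f(t):
    # sample f in L^1(R): positive for t < 1, negative for t > 1
    return (1.0 - t) * math.exp(-t * t)

def g(x, n=2000):
    # composite trapezoid approximation of g(x) = integral of f over [x, x+1]
    h = 1.0 / n
    s = 0.5 * (f(x) + f(x + 1.0))
    for k in range(1, n):
        s += f(x + k * h)
    return s * h

a, b = 0.0, 1.0
assert g(a) > 0 and g(b) < 0  # sign change, as in the question

# bisection: continuity of g guarantees a zero in [a, b]
lo, hi = a, b
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if g(mid) > 0:
        lo = mid
    else:
        hi = mid
c = 0.5 * (lo + hi)
print(c, g(c))  # g(c) ≈ 0
```

The particular $f$ and the trapezoid/bisection tolerances are arbitrary choices for the demonstration; any $f \in L^1$ with a sign change in $g$ would do.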


The function $G(t) = \int_0^t f(s) \, ds$ is continuous. I'll show this for nonnegative $f$; the general case follows by writing $f = f^+ - f^-$, so that $G$ is a difference of two continuous functions. Let $A_n = f^{-1}([n, n+1))$. Given $\epsilon > 0$, choose $N$ such that $\sum_{n \geq N} \int_{A_n} f(s) \, ds < \epsilon/2$ (possible since $f$ is integrable), and set $\delta = \epsilon/(2N)$. Then for any measurable set $E$ with $\lambda(E) < \delta$, splitting $E$ over the sets $A_n$ gives $\int_E f < \epsilon$, which is exactly what the definition of continuity of $G$ requires; so $G$ is continuous (indeed uniformly continuous).
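Spelled out, the splitting estimate behind the claim (taking $\delta = \epsilon/(2N)$, one standard choice) reads, using $f < N$ on each $A_n$ with $n < N$:

$$\int_E f = \sum_{n < N} \int_{E \cap A_n} f + \sum_{n \geq N} \int_{E \cap A_n} f \leq N \, \lambda(E) + \sum_{n \geq N} \int_{A_n} f < N \cdot \frac{\epsilon}{2N} + \frac{\epsilon}{2} = \epsilon.$$

Applying this with $E = [t', t]$, so that $\lambda(E) = |t - t'|$, gives $|G(t) - G(t')| < \epsilon$ whenever $|t - t'| < \delta$.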

Since $g(x) = G(x+1) - G(x)$, this implies that $g$ is continuous, and so we can simply use the intermediate value theorem to produce $c \in [a,b]$ with $g(c) = 0$.