I'm working on a nice problem in measure theory. Given $\mathbb{R}$ with the Lebesgue measure $\lambda$ on its $\sigma$-algebra of measurable sets and a Lebesgue integrable function $f: \mathbb{R} \rightarrow \mathbb{C}$, what can we say about the limit $$\lim_{t\rightarrow 0} \int |f(x)-f(x+t)|\,d\lambda(x)?$$ For continuous functions with compact support this limit is zero by uniform continuity, but I can see some difficulties arising quickly for general integrable $f$. Maybe one can construct a function for which the integral stays bounded away from zero along a suitable sequence of values of $t$.
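For what it's worth, the simplest discontinuous example I can compute still behaves well: taking $f = \chi_{[0,1]}$, the translate $x \mapsto f(x+t)$ is the indicator of $[-t,1-t]$, so for $|t|\leq 1$ the integrand is the indicator of the symmetric difference $[0,1]\,\triangle\,[-t,1-t]$ and $$\int |f(x)-f(x+t)|\,d\lambda(x) = \lambda\big([0,1]\,\triangle\,[-t,1-t]\big) = 2|t| \longrightarrow 0.$$ So any counterexample would have to be worse than a step function.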
When trying to turn this into a rigorous proof (that the limit is in fact zero), I did not see a clear way to proceed.
I'm not looking for a complete solution but any hints would be appreciated!
Some hints:
Prove this statement first for an integral over an arbitrary segment $[a,b]$. You can use the fact that continuous functions are dense in $L_1[a,b]$, together with Cantor's theorem (a continuous function on a compact interval is uniformly continuous).
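To sketch how that density argument could go: fix $\varepsilon>0$ and pick a continuous $g$ on $[a-1,b+1]$ with $\int_{a-1}^{b+1}|f-g|\,d\lambda<\varepsilon$. Then for $|t|\leq 1$, by the triangle inequality, $$\int_a^b|f(x+t)-f(x)|\,d\lambda \leq \int_a^b|f(x+t)-g(x+t)|\,d\lambda + \int_a^b|g(x+t)-g(x)|\,d\lambda + \int_a^b|g(x)-f(x)|\,d\lambda.$$ The first and third terms are $<\varepsilon$ by the choice of $g$ (after a change of variables in the first), and the middle term is at most $(b-a)\sup_{a\leq x\leq b}|g(x+t)-g(x)|$, which tends to $0$ as $t\to 0$ because $g$ is uniformly continuous on the compact interval $[a-1,b+1]$.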
To prove the original statement, for any $\varepsilon>0$ find $N$ large enough that $\displaystyle\int\limits_{|x|\geq N}|f(x)|\,d\lambda<\frac{\varepsilon}{3}$, then split $$\int\limits_{\mathbb{R}}|f(x+t)-f(x)|\,d\lambda=\int\limits_{|x|< N+1}|f(x+t)-f(x)|\,d\lambda+\int\limits_{|x|\geq N+1}|f(x+t)-f(x)|\,d\lambda.$$ By the first hint (applied on a segment such as $[-N-2,N+2]$) there is a $\delta\in(0,1)$ such that the first integral is $<\frac{\varepsilon}{3}$ whenever $|t|\leq\delta$, and the second integral is $<\frac{2\varepsilon}{3}$ for all such $t$.
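The bound on the second integral is just the triangle inequality: since $|t|\leq\delta<1$, both $|x|\geq N$ and $|x+t|\geq N$ hold whenever $|x|\geq N+1$, so $$\int\limits_{|x|\geq N+1}|f(x+t)-f(x)|\,d\lambda \leq \int\limits_{|x|\geq N+1}|f(x+t)|\,d\lambda+\int\limits_{|x|\geq N+1}|f(x)|\,d\lambda \leq 2\int\limits_{|y|\geq N}|f(y)|\,d\lambda < \frac{2\varepsilon}{3}.$$ Adding the two pieces gives $\int_{\mathbb{R}}|f(x+t)-f(x)|\,d\lambda<\varepsilon$ for all $|t|\leq\delta$, as required.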