Lebesgue integral of absolute value as difference goes to zero


Suppose $f\in L^1(\mathbb{R},\mu)$. Prove that

$$\lim_{t\rightarrow 0}\int_\mathbb{R}|f(x)-f(x+t)|d\mu=0$$

When I see a limit like this, I want to move the limit inside the integral sign. Usually this can be done by the monotone convergence theorem or the dominated convergence theorem. But here the limit is $t\rightarrow 0$ instead of a sequence of functions with $n\rightarrow\infty$. What can we do?
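One standard workaround, and the one both answers below rely on, is the sequential characterization of limits (valid since $\mathbb{R}$ is a metric space):

$$\lim_{t\to 0}\int_{\mathbb{R}}|f(x)-f(x+t)|\,d\mu=0 \quad\Longleftrightarrow\quad \lim_{n\to\infty}\int_{\mathbb{R}}|f(x)-f(x+t_n)|\,d\mu=0 \ \text{ for every sequence } t_n\to 0,$$

after which the dominated convergence theorem (or Fatou's lemma) can be applied along each sequence.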


2 Answers

BEST ANSWER

I will prove it for bounded, Riemann integrable functions; the proof works for continuous functions too.

We have $\left|f(x)-f(x+t)\right|\leq |f(x)|+|f(x+t)|$, so for each fixed $t$ the integrand is bounded by a function in $ L^1(\mathbb R,\mu)$.

Using the reverse Fatou lemma, for any sequence $\{a_n\}$ we have: $$ \limsup_{n\to\infty}\int_\mathbb R |f(x)-f(x+a_n)|~d\mu\leq \int_\mathbb R \limsup_{n\to\infty}|f(x)-f(x+a_n)|~d\mu. $$ In particular, this holds for every sequence $\{a_n\}$ converging to zero, so it suffices to show that $f$ is bounded and that its set of discontinuity points has measure zero. Since the function is Riemann integrable and bounded, it is almost everywhere continuous; at every continuity point $x$ we have $|f(x)-f(x+a_n)|\to 0$, so the right-hand side vanishes, which concludes the proof.
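One point in this argument deserves emphasis: the reverse Fatou lemma requires a single dominating function that does not depend on $n$, and the bound $|f(x)|+|f(x+a_n)|$ above still varies with $n$. Assuming in addition that $f$ is supported in a bounded interval $[a,b]$ (as when $f$ is Riemann integrable on an interval) and that $|a_n|\leq 1$ for all $n$ (discard finitely many terms), with $M=\sup|f|$ one fixed dominating function is

$$|f(x)-f(x+a_n)|\leq 2M\,\chi_{[a-1,\,b+1]}(x)\in L^1(\mathbb{R},\mu).$$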

ANSWER

Let $ f \in {C_{c}}(\mathbb{R}) $. Define an $ \mathbb{R} $-indexed family $ (f_{t})_{t \in \mathbb{R}} $ of $ \mathbb{C} $-valued functions on $ \mathbb{R} $ by $$ \forall t \in \mathbb{R}, ~ \forall x \in \mathbb{R}: \quad {f_{t}}(x) \stackrel{\text{def}}{=} f(x + t). $$ As it is clear that $ f \in {L^{1}}(\mathbb{R}) $, the translation invariance of $ \mu $ yields $ f_{t} \in {L^{1}}(\mathbb{R}) $, hence $ f - f_{t} \in {L^{1}}(\mathbb{R}) $, for all $ t \in \mathbb{R} $. We can thus define a function $ F: \mathbb{R} \to [0,\infty) $ by $$ \forall t \in \mathbb{R}: \quad F(t) \stackrel{\text{def}}{=} \int_{\mathbb{R}} |f - f_{t}| \, d{\mu}. $$ Once we have shown that $ \displaystyle \lim_{n \to \infty} F(t_{n}) = 0 $ for every sequence $ (t_{n})_{n \in \mathbb{N}} $ in $ \mathbb{R} $ that converges to $ 0 $, we can conclude that $ \displaystyle \lim_{t \to 0} F(t) = 0 $, which is precisely what we want.

Let us begin our argument by picking an arbitrary sequence $ (t_{n})_{n \in \mathbb{N}} $ in $ \mathbb{R} $ that converges to $ 0 $. Let $ M := \| f \|_{\infty} $ and $$ E := \bigcup_{n \in \mathbb{N}} \text{supp} \left( f_{t_{n}} \right) = \bigcup_{n \in \mathbb{N}} [\text{supp}(f) - t_{n}]. $$ As $ (t_{n})_{n \in \mathbb{N}} $ is a bounded sequence and $ \text{supp}(f) $ is compact, it follows that $ E $ is a bounded measurable subset of $ \mathbb{R} $. By the Triangle Inequality, we obtain $$ \forall n \in \mathbb{N}: \quad |f - f_{t_{n}}| \leq |f| + |f_{t_{n}}| \leq |f| + M \cdot \chi_{E} \in {L^{1}}(\mathbb{R}), $$ which shows that $ (|f - f_{t_{n}}|)_{n \in \mathbb{N}} $ is dominated by a single $ {L^{1}} $-function. The continuity of $ f $ ensures that $ (|f - f_{t_{n}}|)_{n \in \mathbb{N}} $ converges pointwise to the zero function, so applying Lebesgue’s Dominated Convergence Theorem, we get $$ \lim_{n \to \infty} F(t_{n}) = \lim_{n \to \infty} \int_{\mathbb{R}} |f - f_{t_{n}}| \, d{\mu} = \int_{\mathbb{R}} 0 \, d{\mu} = 0. $$

To solve the original problem, use the fact that $ {C_{c}}(\mathbb{R}) $ is dense in $ ({L^{1}}(\mathbb{R}),\| \cdot \|_{{L^{1}}(\mathbb{R})}) $. You can either take this fact for granted, or you can prove it using both Urysohn’s Lemma (or rather, a special case of it) and the more fundamental fact that $ \mu $ is a regular measure on $ (\mathbb{R},\mathscr{B}(\mathbb{R})) $.
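For completeness, here is a sketch of that approximation step, assuming the density fact just cited: given $ \varepsilon > 0 $, choose $ g \in {C_{c}}(\mathbb{R}) $ with $ \| f - g \|_{L^{1}} < \varepsilon $. By the triangle inequality and the translation invariance of $ \mu $ (which gives $ \| f_{t} - g_{t} \|_{L^{1}} = \| f - g \|_{L^{1}} $),

$$\int_{\mathbb{R}} |f - f_{t}| \, d\mu \leq \| f - g \|_{L^{1}} + \int_{\mathbb{R}} |g - g_{t}| \, d\mu + \| g_{t} - f_{t} \|_{L^{1}} = 2\,\| f - g \|_{L^{1}} + \int_{\mathbb{R}} |g - g_{t}| \, d\mu < 3\varepsilon$$

for all $ |t| $ small enough, by the result just proved for functions in $ {C_{c}}(\mathbb{R}) $.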