My question is regarding the proof of Proposition 2.5 (p.74) in Stein & Shakarchi's Real Analysis:
Proposition 2.5: Suppose $f \in L^1(\mathbb{R}^d)$. Then $$\|f_h - f \|_{L^1} \rightarrow 0 \quad \textit{as } h \rightarrow 0.$$
The proof is a simple consequence of the approximation of integrable functions by continuous functions of compact support as given in Theorem 2.4. In fact for any $\epsilon > 0$, we can find a [continuous and compactly supported] function $g$ so that $\|f - g\| < \epsilon.$ Now $$f_h - f = (g_h - g) + (f_h - g_h) - (f-g).$$
However, $\|f_h - g_h\| = \|f-g\| < \epsilon$, while since $g$ is continuous and has compact support we have that clearly $$\|g_h - g \| = \int_{\mathbb{R}^d} |g(x-h) - g(x)|\,dx \rightarrow 0 \text{ as } h \rightarrow 0.$$ So if $|h| < \delta$, where $\delta$ is sufficiently small, then $\|g_h - g\| < \epsilon$, and as a result $\|f_h-f\| < 3\epsilon$, whenever $|h| < \delta$.
My question: I'm unsure about this step: "We have that clearly
$$\|g_h - g \| = \int_{\mathbb{R}^d} |g(x-h) - g(x)|\,dx \rightarrow 0 \text{ as } h \rightarrow 0$$
..."
Why must this be true?
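As a quick numerical sanity check (not a proof), one can estimate $\|g_h - g\|_{L^1}$ on a grid for a concrete choice of $g$ and watch the gap shrink as $h \rightarrow 0$. A sketch, where the tent function and the grid bounds are my own choices:

```python
import numpy as np

# A concrete continuous, compactly supported g: the tent function on [-1, 1].
def g(x):
    return np.maximum(0.0, 1.0 - np.abs(x))

# Riemann-sum estimate of ||g_h - g||_{L^1}, where g_h(x) = g(x - h).
# The grid [-3, 3] safely contains the supports of g and g_h for |h| <= 1.
def l1_gap(h, n=200001):
    x = np.linspace(-3.0, 3.0, n)
    dx = x[1] - x[0]
    return np.sum(np.abs(g(x - h) - g(x))) * dx

gaps = [l1_gap(h) for h in (0.5, 0.1, 0.01, 0.001)]
print(gaps)  # shrinks toward 0 as h -> 0
```

For this particular $g$ a direct computation gives $\|g_h - g\|_{L^1} = 2h - h^2/2$ for $0 < h \leq 1$, which the grid estimates reproduce closely.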
My thought was to let $$M(h) := \max\limits_{x \in \text{supp}(g)} |g_h(x) - g(x)|$$ and then argue that
- $M(h)$ is defined for all $h$ because $\text{supp}(g)$ is compact, and then
- $M(h) \rightarrow 0$ as $h \rightarrow 0$ (since we clearly have $g_h \rightarrow g$ pointwise).
Is this the right idea? It feels like we are using uniform convergence, but in this case $\{g_h\}_{h \in \mathbb{R}}$ is not a sequence of functions...is it accurate to say that $g_h \rightarrow g$ uniformly as $h \rightarrow 0$? Or is there a different terminology for this type of convergence (aside from pointwise)?
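A small numerical illustration of the idea in the bullets above, again with a tent function of my own choosing: the grid estimate of $\sup_x |g_h(x) - g(x)|$ does shrink with $h$, and since $g_h - g$ vanishes outside a fixed bounded set (here $[-2,2]$ when $|h| \leq 1$), the sup-distance controls the $L^1$ distance via $\|g_h - g\|_{L^1} \leq m([-2,2]) \cdot \sup_x |g_h(x) - g(x)|$:

```python
import numpy as np

def g(x):
    # tent function: continuous, supported on [-1, 1]
    return np.maximum(0.0, 1.0 - np.abs(x))

x = np.linspace(-3.0, 3.0, 600001)  # covers supp(g_h - g) for |h| <= 1
dx = x[1] - x[0]

sup_gaps, l1_gaps = [], []
for h in (0.5, 0.1, 0.01):
    diff = np.abs(g(x - h) - g(x))
    sup_gaps.append(diff.max())      # grid estimate of sup_x |g_h(x) - g(x)|
    l1_gaps.append(diff.sum() * dx)  # grid estimate of ||g_h - g||_{L^1}

print(sup_gaps)  # for the tent function this is ~h
print(l1_gaps)   # bounded by 4 * sup_gap, and also shrinking with h
```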
Regarding the notation: $g_h(x) := g(x-h)$ (the translation of $g$ by $h$), and the authors use $\| \cdot \|$ (without the subscript) to denote the $L^1$ norm.
Update 1/2/22: I believe I've figured out my question, which came down to proving that $g_h \rightarrow g$ in $L^1$ as $h \rightarrow 0$ for continuous, compactly supported functions. The proof I've come up with is based on Egorov's Theorem, and is given below:
Lemma: If $g: \mathbb{R}^d \rightarrow \mathbb{C}$ is continuous and supported on a compact set $A \subset \mathbb{R}^d$, then $g_h \rightarrow g$ in $L^1$ as $h \rightarrow 0$.
Proof: Let $\epsilon > 0$, and let $\{h_n\}_{n=1}^{\infty}$ be a sequence in $\mathbb{R}^d$ such that $|h_n| \rightarrow 0$ as $n \rightarrow \infty$; without loss of generality, $|h_n| \leq 1$ for all $n$. Let $A' := \{x \in \mathbb{R}^d : \operatorname{dist}(x, A) \leq 1\}$, which is compact. Since $g$ is supported on $A$ and $|h_n| \leq 1$, each function $g_{h_n} - g$ vanishes outside $A'$, so it suffices to show that $\int_{A'} |g_{h_n} - g| \rightarrow 0$. Since $g$ is continuous, we clearly have $\lim\limits_{h \to 0} g_h(x) = g(x)$ for all $x$. Therefore, the sequence of functions $\{g_{h_n}(x)\}_{n=1}^{\infty}$ converges pointwise to $g(x)$ for all $x$. Also, $g$ and $\{g_{h_n}\}_{n=1}^{\infty}$ are all measurable functions because continuous functions are measurable (Property 2, p.29). Then since $m(A') < \infty$, by Egorov's Theorem (Theorem 4.4, p.33) there exists a closed (hence compact) set $K_{\epsilon} \subset A'$ such that $m(A' \setminus K_{\epsilon}) \leq \epsilon$ and $g_{h_n} \rightarrow g$ uniformly on $K_{\epsilon}$ as $n \rightarrow \infty$. It follows from this uniform convergence that there exists an integer $N > 0$ such that $$ n \geq N \implies \sup_{x \in K_{\epsilon}} |g_{h_n}(x) - g(x)| \leq \epsilon .$$
Furthermore, because $|g_{h_n}(x) - g(x)| \leq 2\sup_{x \in A} |g(x)|$ for all $x$, and a continuous function on a compact set is bounded, there exists some $M > 0$ such that $\sup_{x \in \mathbb{R}^d} |g_{h_n}(x) - g(x)| \leq M$ for all $n$. It follows that for $n \geq N$ we have \begin{align*} \int_{A'} |g_{h_n}(x) - g(x)|\,dx &= \int_{K_{\epsilon}} |g_{h_n}(x) - g(x)|\,dx + \int_{A' \setminus K_{\epsilon}} |g_{h_n}(x) - g(x)|\,dx \\[5pt] &\leq m(A')\, \epsilon + M \epsilon. \\[5pt] \end{align*}
Then since $\epsilon > 0$ was arbitrary (and $M$ and $m(A')$ are independent of $\epsilon$ and $n$), we have $$\lim_{n \to \infty} \int_{\mathbb{R}^d} |g_{h_n}(x) - g(x)|\,dx = 0.$$
And since $\{h_n\}_{n=1}^{\infty}$ was an arbitrary sequence converging to $0 \in \mathbb{R}^d$, it follows that $$\lim_{h \to 0} \int_{\mathbb{R}^d} |g_h(x) - g(x)|\,dx = 0, $$ i.e. $g_h \rightarrow g$ in $L^1$ as $h \rightarrow 0$, as desired. $\quad \square$
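Incidentally, the answer to the terminology question above is yes: the convergence really is uniform (with $h$ a continuous parameter rather than a sequence index), and this gives a shorter proof of the Lemma that bypasses Egorov's Theorem entirely. Since $g$ is continuous and compactly supported, it is uniformly continuous on $\mathbb{R}^d$, and therefore $$\sup_{x \in \mathbb{R}^d} |g(x-h) - g(x)| \rightarrow 0 \quad \text{as } h \rightarrow 0,$$ i.e. $g_h \rightarrow g$ uniformly. Moreover, for $|h| \leq 1$ the difference $g_h - g$ vanishes outside the compact set $A' := \{x : \operatorname{dist}(x, A) \leq 1\}$, so $$\|g_h - g\| = \int_{A'} |g(x-h) - g(x)|\,dx \leq m(A') \cdot \sup_{x \in \mathbb{R}^d} |g(x-h) - g(x)| \rightarrow 0 \quad \text{as } h \rightarrow 0.$$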