Prove continuity in the $p$-mean / Lebesgue Points


Lemma: Let $p \in [1, \infty)$, let $I := (a,b)$ be a real interval and let $u \in L^p(I)$ be arbitrary. Then we have \begin{equation*} \forall \varepsilon > 0 \ \exists \delta > 0: | h | < \delta \implies \left( \int_{a}^{b} | u(x + h) - u(x) |^p \, dx \right)^{\frac{1}{p}} < \varepsilon, \end{equation*} where $u$ is extended by $0$ outside of $I$.
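
The lemma can be probed numerically even for a discontinuous $u$ (a minimal sketch with my own choices of example and grid, not part of the question): for the indicator $u = \chi_{[0,1]}$ one has $\int_{\mathbb{R}} |u(x+h) - u(x)|^p \, dx = 2|h|$, so the shift norm equals $(2|h|)^{1/p}$ and tends to $0$ with $h$, even though $u$ itself is discontinuous.

```python
import numpy as np

def shift_pnorm(u, h, p, grid):
    """Approximate (integral of |u(x+h) - u(x)|^p dx)^(1/p) on a uniform grid."""
    dx = grid[1] - grid[0]
    return (np.sum(np.abs(u(grid + h) - u(grid)) ** p) * dx) ** (1.0 / p)

# u = indicator of [0,1], extended by 0 outside -- discontinuous, yet
# continuous in the p-mean: the exact shift norm is (2|h|)^(1/p).
u = lambda x: ((x >= 0.0) & (x <= 1.0)).astype(float)
grid = np.linspace(-2.0, 3.0, 500001)  # covers supp(u) and its small shifts

for h in [0.1, 0.01, 0.001]:
    val = shift_pnorm(u, h, p=2, grid=grid)
    # val is close to (2*h)**0.5 and decreases with h
```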

My Questions

  1. Why is the extension by zero admissible? Is it smooth? If $u$ had compact support in $I$, this would always be possible, but that's not the case here.
  2. How would one prove this? This question is much more general ($\mathbb{R}^d$ instead of $I$), and from the Wikipedia proof of the Lebesgue differentiation theorem, which I wasn't able to follow, I assume I have to consider something like \begin{equation*} E_{\alpha} := \left\{ x \in I : \limsup_{r \searrow 0} \left| \frac{1}{2r} \int_{x - r}^{x + r} u(y) \, dy - u(x) \right| > 2 \alpha \right\}, \end{equation*} show that it has measure $0$ for every $\alpha > 0$, and use that compactly supported continuous functions are dense in $L^p$.

We have already shown a similar result earlier, but since $u$ need not be uniformly continuous, I don't know how to recycle that proof for the lemma above.

Theorem: Let $X$ be a real Banach space and $u \in \mathcal{C}([a,b]; X)$. Then we have

  1. $\lim_{h \to 0} \frac{1}{h} \int_{t}^{t + h} u(s) ds = u(t)$ for all $t \in [a,b]$.
  2. $\lim\limits_{h \to 0} \frac{1}{h} \int_{t}^{t + h} \| u(s) - u(t) \| ds = 0$ for all $t \in [a,b]$.
  3. $\lim\limits_{h \to 0} \int_{a}^{b} \| u(t + h) - u(t) \| dt = 0$, where $u$ is extended by $0$ outside of $[a,b]$.

Proof. Statement 1 follows from statement 2: \begin{align*} \left\| \frac{1}{h} \int_{t}^{t + h} u(s) \, ds - u(t) \right\|_{X} & = \left\| \frac{1}{h} \int_{t}^{t + h} \big( u(s) - u(t) \big) \, ds \right\|_{X} \\ & \le \frac{1}{| h |} \int_{\min(t, t + h)}^{\max(t, t + h)} \| u(s) - u(t) \|_X \, ds \xrightarrow{2.} 0. \end{align*}

  2. Because $u$ is continuous at an arbitrary $t \in [a,b]$, we have $$ \forall \varepsilon > 0 \ \exists \delta > 0: \| u(t) - u(s) \| < \varepsilon \ \forall s \in [a,b] \text{ with } | s - t | \le \delta. $$ Choose $|h| < \delta$ (w.l.o.g. $h > 0$). We conclude $$ \frac{1}{h} \int_{t}^{t + h} \underbrace{\| u(s) - u(t) \|}_{< \varepsilon} \, ds < \varepsilon. $$

  3. $u$ is uniformly continuous on the compact interval $[a,b]$: $$ \forall \tilde{\varepsilon} > 0 \ \exists \tilde{\delta} > 0: \| u(t) - u(s) \| \le \tilde{\varepsilon} \ \forall s,t \in [a,b] \text{ with } | s - t | \le \tilde{\delta}. $$ For $|h| < \tilde{\delta}$ (and w.l.o.g. $h > 0$) we have \begin{align*} \int_{a}^{b} \| u(t + h) - u(t) \| \, dt &= \int_{a}^{b - h} \| u(t + h) - u(t) \| \, dt + \int_{b - h}^{b} \| 0 - u(t) \| \, dt \\ & \le \tilde{\varepsilon} (b - a) + \|u \|_{\infty} \cdot h. \end{align*}

Given $\varepsilon > 0$, first choose $\tilde{\delta}$ with respect to $\tilde{\varepsilon} := \frac{\varepsilon}{2(b - a)}$, then take $|h| < \min\left\{ \frac{\varepsilon}{2 \| u \|_{\infty}}, \tilde{\delta} \right\}$ (w.l.o.g. $u \not\equiv 0$); the bound above is then $< \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon$. $\square$
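
As a sanity check on the estimate $\tilde{\varepsilon}(b-a) + \|u\|_\infty \cdot h$ just derived, here is a small numerical verification (my own illustration, not part of the original proof): take $u = \sin$ on $[a,b] = [0,\pi]$, extended by $0$. Since $\sin$ is $1$-Lipschitz, one may take $\tilde{\varepsilon} = h$, and the bound becomes $h(b-a) + 1 \cdot h$.

```python
import numpy as np

a, b = 0.0, np.pi
h = 0.01
t = np.linspace(a, b, 100001)
dt = t[1] - t[0]

def u(s):
    # sin on [a, b], extended by 0 outside
    return np.where((s >= a) & (s <= b), np.sin(s), 0.0)

# left-hand side of statement 3 for this h (Riemann-sum approximation)
integral = np.sum(np.abs(u(t + h) - u(t))) * dt

# bound from the proof: omega(h)*(b - a) + ||u||_inf * h, with omega(h) = h
bound = h * (b - a) + 1.0 * h
# integral is roughly 2h here, comfortably below the bound
```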

Best Answer

For $h\in\mathbb R$ let $T_h : L^p(\mathbb R)\to L^p(\mathbb R)$ be defined by $T_hu(x) = u(x+h)$. Note that $\|T_hu\|_p = \|u\|_p$ for all $u\in L^p(\mathbb R)$.

Now, let $u\in L^p(\mathbb R)$ be fixed and let $\epsilon>0$ be given. Then we find $\phi\in C_0^\infty(\mathbb R)$ such that $\|u-\phi\|_p < \epsilon/3$. Hence, $$ \|T_hu - u\|_p\,\le\,\|T_h(u-\phi)\|_p + \|T_h\phi - \phi\|_p + \|\phi-u\|_p < \frac 23\epsilon + \|T_h\phi - \phi\|_p. $$ Now, by what you have already shown for uniformly continuous $u$ we find $\delta > 0$ such that $$ \|T_h\phi - \phi\|_p = \left(\int|\phi(x+h)-\phi(x)|^p\,dx\right)^{1/p} < \frac\epsilon 3 $$ for $|h| < \delta$. Thus, for these $|h|<\delta$ we have $\|T_hu - u\|_p < \epsilon$.
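
The density argument above can be mirrored numerically (a sketch with assumed choices: an indicator for $u$, a Gaussian-mollified $\phi$ in place of a true $C_0^\infty$ approximant, and a periodic grid shift as $T_h$, which is an exact isometry for the discrete norm):

```python
import numpy as np

p = 2
x = np.linspace(-2.0, 3.0, 50001)
dx = x[1] - x[0]

def pnorm(f):
    """Discrete L^p norm on the grid."""
    return (np.sum(np.abs(f) ** p) * dx) ** (1.0 / p)

def shift(f, h):
    """T_h f(x) = f(x + h), realized as an index shift on the grid."""
    n = int(round(h / dx))
    return np.roll(f, -n)  # wrap-around is harmless: f vanishes near the ends

u = ((x >= 0.0) & (x <= 1.0)).astype(float)   # indicator of [0,1]

# phi: u convolved with a narrow Gaussian kernel -- a smooth stand-in for
# the C_0^infty approximant from the density argument
sigma = 0.02
s = np.arange(-1000, 1001) * dx
kernel = np.exp(-0.5 * (s / sigma) ** 2)
kernel /= kernel.sum()
phi = np.convolve(u, kernel, mode="same")

h = 0.05
lhs = pnorm(shift(u, h) - u)                            # ||T_h u - u||_p
rhs = 2 * pnorm(u - phi) + pnorm(shift(phi, h) - phi)   # three-epsilon bound
# lhs <= rhs, because np.roll preserves the discrete norm (T_h is an isometry)
# and the triangle inequality applies verbatim
```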