On a bound for the application of the dominated convergence theorem.


In class we are following Hörmander's book Linear Partial Differential Operators.

It is stated that: Let $u$ be an integrable function that vanishes outside a compact subset $K$ of $\Omega$ (where $\Omega$ is an open set in $\mathbb{R}^n$); then $u_\epsilon \in C_0^{\infty}(\Omega)$ if $\epsilon$ is smaller than the distance $\delta$ from $K$ to $\Omega^c$.

Here $u_\epsilon(x) := \int_{\mathbb{R}^n} u(x-\epsilon y) \phi(y) \, dy$ and $\phi$ is a test function.

To prove the differentiability of $u_\epsilon$ we wish to bring the limit as $h \rightarrow 0$ inside the integral. I wrote in my notes that the bound (in the case $n = 1$) would be

$$\left| u(y)\frac{\phi_\epsilon(x+h) - \phi_\epsilon(x)}{h} \right| \le 2 |u(y)|, $$ where $\phi_\epsilon(x) := \epsilon^{-n} \phi(x/ \epsilon)$. This allows us to apply the Lebesgue dominated convergence theorem to take the limit inside the integral. How was this bound found? A fellow student of mine suggested that $\phi_\epsilon \le 1$, but here we have an $h$ in the denominator.
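As a quick numerical sanity check (a sketch only, assuming the standard bump function $\phi(y) = e^{-1/(1-y^2)}$ on $(-1,1)$ as a concrete test function, which need not be the $\phi$ from class), one can verify in dimension $n = 1$ that the difference quotients of $\phi_\epsilon$ are bounded uniformly in $h$ by $\sup |\phi_\epsilon'|$, rather than by anything involving $\phi_\epsilon \le 1$ alone:

```python
import numpy as np

def phi(y):
    # Standard C_0^infty bump: exp(-1/(1 - y^2)) on (-1, 1), zero elsewhere.
    y = np.asarray(y, dtype=float)
    out = np.zeros_like(y)
    inside = np.abs(y) < 1
    out[inside] = np.exp(-1.0 / (1.0 - y[inside] ** 2))
    return out

def phi_eps(x, eps):
    # phi_eps(x) = eps^{-n} phi(x / eps), here with n = 1.
    return phi(x / eps) / eps

eps = 0.5
x = np.linspace(-2, 2, 401)

# Largest difference quotient of phi_eps over a range of step sizes h.
max_quotient = 0.0
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    q = np.abs(phi_eps(x + h, eps) - phi_eps(x, eps)) / h
    max_quotient = max(max_quotient, q.max())

# Numerical estimate of sup |phi_eps'| via a fine central difference.
hh = 1e-6
sup_deriv = np.max(np.abs(phi_eps(x + hh, eps) - phi_eps(x - hh, eps)) / (2 * hh))

# The mean value theorem predicts every difference quotient is at most sup |phi_eps'|
# (small tolerance for grid/discretization error).
print(max_quotient <= 1.01 * sup_deriv)
```

The point of the check is that the uniform bound comes from the derivative of $\phi_\epsilon$, not from a bound on $\phi_\epsilon$ itself.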

EDIT: The bound that seems more natural to me is $$\left| u(y)\frac{\phi_\epsilon(x+h) - \phi_\epsilon(x)}{h} \right| \le |u(y)| \sup_h \left| \frac{\phi_\epsilon(x+h) - \phi_\epsilon(x)}{h}\right| $$

and the supremum would exist and be finite, since test functions are smooth with compactly supported derivatives, so the difference quotients of $\phi_\epsilon$ are uniformly bounded.
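Spelling this out for $n = 1$ (a sketch of how I believe the supremum is controlled): by the mean value theorem there is some $\xi$ between $x$ and $x+h$ with

$$\left| \frac{\phi_\epsilon(x+h) - \phi_\epsilon(x)}{h} \right| = |\phi_\epsilon'(\xi)| \le \sup_{t \in \mathbb{R}} |\phi_\epsilon'(t)| = \epsilon^{-2} \sup_{t \in \mathbb{R}} |\phi'(t)| < \infty,$$

using $\phi_\epsilon(t) = \epsilon^{-1}\phi(t/\epsilon)$, so $\phi_\epsilon'(t) = \epsilon^{-2}\phi'(t/\epsilon)$, and the fact that $\phi' \in C_0^{\infty}(\mathbb{R})$ is bounded. This bound is independent of $h$, which is exactly what dominated convergence needs.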