An inequality using Sobolev norms


Let $\| \cdot \|_{H^s(\mathbb R)}$ be the usual Sobolev norm on $\mathbb R$ and let $r>0$. If $$ \|f\|_{L^\infty(\mathbb R)} \le \| f\|_{H^k(\mathbb R)} $$ holds for all $k>r$, does the inequality $$ \|f\|_{L^\infty(\mathbb R)} \le \| f\|_{H^r(\mathbb R)} $$ still hold?


This is not true. The $H^r$ norm controls the supremum of $f$ if and only if $2r>n$, where $n$ is the dimension of the space ($n=1$ in your case), by the Sobolev-Morrey embedding. In the borderline case $r=n/2$ the embedding fails. This failure at the endpoint is typical of Sobolev embeddings.

An example is easier to give in the periodic setting, on $\mathbb R/(2\pi \mathbb Z)$. Let $$f(x) = \sum_{n=2}^\infty \frac{1}{n \log n}\cos nx. $$ Since the coefficients are positive and $\sum_{n=2}^\infty \frac{1}{n \log n}$ diverges, the partial sums at $x=0$ grow without bound, so $f$ is unbounded near $0$. On the other hand, $$ \|f\|_{H^{1/2}}^2 \approx \sum_{n=2}^\infty \frac{n}{n^2 \log^2 n} = \sum_{n=2}^\infty \frac{1}{n \log^2 n} <\infty. $$
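As a quick numerical sanity check (not a proof), one can compare the partial sums of the series at $x=0$, which should grow without bound, against the partial sums defining $\|f\|_{H^{1/2}}^2$, which should converge. A minimal Python sketch, where `partial_sums` is a helper name introduced here:

```python
import math

def partial_sums(N):
    # Partial sum of the Fourier series at x = 0: sum 1/(n log n), divergent
    f_at_0 = sum(1.0 / (n * math.log(n)) for n in range(2, N + 1))
    # Partial sum for the H^{1/2} norm squared: sum 1/(n log^2 n), convergent
    h_half_sq = sum(1.0 / (n * math.log(n) ** 2) for n in range(2, N + 1))
    return f_at_0, h_half_sq

for N in (10**3, 10**5, 10**6):
    s, h = partial_sums(N)
    print(f"N={N}: f(0) partial sum = {s:.3f}, H^(1/2) partial sum = {h:.3f}")
```

The first column keeps growing (like $\log\log N$, hence very slowly), while the second stabilizes, which is consistent with the claim above.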

To adapt this example to $\mathbb R$, you can multiply $f$ by $e^{-x^2}$; the price to pay is a messier verification that the resulting function lies in $H^{1/2}(\mathbb R)$.