How to prove that there exists a constant C such that $\max |f|^{2} \le C \int \left ( |f'|^{2} + f^{2} \right ) dx$?


There is an inequality used in the paper by Gage and Hamilton, but I am not sure how it is derived. It states that there exists a constant $C$ such that, on an interval $I$,

$\max |f|^{2} \le C \int_{I} \left ( |f'|^{2} + f^{2} \right ) dx$

where $f$ is not assumed to be continuous on $I$.


2 Answers

Best Answer

What is intended is surely an instance of the Sobolev imbedding theorem: in one dimension, the $H^1$ norm dominates the sup norm. It suffices to prove the estimate for smooth test functions, where the derivative is the classical one, since test functions are dense in $H^1$. For general $f$, the derivative $f'$ must then be interpreted in the weak ($L^2$) sense, so that the statement makes sense when $f$ is not classically differentiable.
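To see where the inequality comes from, here is a standard sketch for a smooth $f$ on a bounded interval $I$ of length $|I|$ (the constant below is one admissible choice, not the sharp one). By the mean value theorem for integrals there is a point $y \in I$ with

$f(y)^{2} = \frac{1}{|I|} \int_{I} f^{2} \, dx.$

For any $x \in I$, the fundamental theorem of calculus and the elementary bound $2|f f'| \le |f'|^{2} + f^{2}$ give

$f(x)^{2} = f(y)^{2} + \int_{y}^{x} 2 f f' \, dt \le \frac{1}{|I|} \int_{I} f^{2} \, dx + \int_{I} \left ( |f'|^{2} + f^{2} \right ) dx,$

so taking the maximum over $x \in I$ yields the inequality with $C = 1 + 1/|I|$. The estimate then extends to all of $H^{1}(I)$ by density.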

In that vein, by the contrapositive: if $f$ is not equal almost everywhere to a continuous function, it cannot lie in $H^1$, so the right-hand side is $+\infty$ under any reasonable interpretation, and the inequality is vacuously true in that case.
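This is because every $H^1$ function on an interval has an absolutely continuous representative. As a concrete illustration (a standard example, not taken from the original answer): let $f = \chi_{[c,d]}$ be the indicator of a subinterval with endpoints interior to $I$. Its distributional derivative is $\delta_{c} - \delta_{d}$, which is not given by any $L^2$ function, so $f \notin H^{1}(I)$, and the right-hand side of the inequality is $+\infty$.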

Second answer

In fact your question deals with the Sobolev space $H^1$. Have a look at the proof of the Rellich–Kondrachov theorem in this paper.