There is an inequality used in the paper by Gage and Hamilton, but I am not sure where it comes from. It states that there exists a constant $C$ such that on an interval $I$
$\max_{I} |f|^{2} \le C \int_{I} \left( |f'|^{2} + f^{2} \right) dx,$
where $f(x)$ need not be continuous on $I$.
What is intended is surely an instance of the Sobolev imbedding theorem: in one dimension the $H^1$ norm dominates the sup norm. It suffices to prove the inequality for smooth (test) functions, where the derivative is classical, since these are dense in $H^1$; the estimate then extends by continuity once $f'$ is interpreted as the weak ($L^2$) derivative, which makes sense even when $f$ is not classically differentiable.
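For smooth $f$ the one-dimensional estimate follows from the fundamental theorem of calculus and the elementary bound $2ab \le a^2 + b^2$; a sketch, on $I = [a,b]$ with $|I| = b - a$:

```latex
\begin{align*}
% Compare f(x)^2 with f(y)^2 for any x, y in I:
f(x)^2 &= f(y)^2 + \int_y^x 2 f f'\,dt
        \le f(y)^2 + \int_I \left( |f'|^2 + f^2 \right) dt,
% using 2|f f'| <= |f'|^2 + f^2.  Now average over y in I:
\\
f(x)^2 &\le \frac{1}{|I|} \int_I f(y)^2\,dy + \int_I \left( |f'|^2 + f^2 \right) dt
        \le \left( 1 + \frac{1}{|I|} \right) \int_I \left( |f'|^2 + f^2 \right) dx.
\end{align*}
```

So $C = 1 + 1/|I|$ works for smooth $f$, and density extends it to all of $H^1(I)$. Note that $C$ necessarily depends on the length of $I$ and blows up as the interval shrinks.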
In that vein, by the contrapositive: in one dimension every $H^1$ function has a continuous representative (Morrey's embedding), so if $f$ has no continuous representative it cannot lie in $H^1$, and the right-hand side is $+\infty$ under any reasonable interpretation. The inequality is then vacuously true in that case.
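As a quick numerical sanity check of the explicit constant $C = 1 + 1/|I|$ from the averaging argument, one can evaluate both sides on a concrete smooth function. The test function $f(x) = \sin(3x) + x^2$ on $I = [0,1]$ below is an arbitrary example of my choosing, not anything from the paper:

```python
import numpy as np

# Check max_I |f|^2 <= C * int_I (|f'|^2 + f^2) dx on I = [0, 1],
# with C = 1 + 1/|I| = 2 as in the averaging proof sketched above.
# f(x) = sin(3x) + x^2 is a hypothetical smooth test function.

x = np.linspace(0.0, 1.0, 100001)
f = np.sin(3 * x) + x**2
fp = 3 * np.cos(3 * x) + 2 * x          # exact classical derivative of f

g = fp**2 + f**2                         # H^1-type integrand
energy = np.sum((g[1:] + g[:-1]) / 2 * np.diff(x))  # trapezoid rule

sup_sq = np.max(np.abs(f))**2
C = 2.0                                  # 1 + 1/|I| on the unit interval

print(f"max|f|^2 = {sup_sq:.4f}, C * energy = {C * energy:.4f}")
print("inequality holds:", sup_sq <= C * energy)
```

This only illustrates the inequality for one function, of course; the proof above is what guarantees it for all of $H^1(I)$.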