Finding a bound for the first derivative of a C^2 function in the p-norm


Let $u\in C^2([0,1])$ and let $\epsilon >0$ be arbitrary. Show that there exists an absolute constant $C>0$ (independent of $u$ and $\epsilon$) such that

$$ \int_0^1 |u'|^p dx \leq \epsilon^p \int_0^1 |u''|^p dx + \frac{C}{\epsilon^p} \int_0^1 |u|^p dx. $$

Hint. Consider an interval $(a,b)\subset(0,1)$ of length $\epsilon$ (the same $\epsilon$ as above). Use the mean value theorem to bound $|u'(x)|$ for some $x\in(a,b)$ in terms of $\epsilon$, $u(x_1)$, $u(x_2)$, where $x_1,x_2\in(a,b)$. Then use the fundamental theorem of calculus on $u'$ to obtain an estimate of $|u'(x)|$ for any $x\in(a,b)$. Integrate this estimate with respect to $x_1$ and $x_2$, raise everything to the power $p$, and apply Hölder's inequality to obtain an estimate of $|u'(x)|$ in terms of the integral quantities on the RHS of the desired result.
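Before proving the inequality, one can sanity-check it numerically. The test function $u(x)=\sin(kx)$, the exponent $p=2$, and the constant $C=72^p$ below are illustrative assumptions only; the problem merely asserts that some absolute constant $C$ exists.

```python
import numpy as np

# Numerical sanity check of the claimed inequality on a sample function.
# Assumptions for illustration only: u(x) = sin(k*x), p = 2, and the
# hypothetical constant C = 72^p (the problem only asserts SOME absolute C works).
p = 2.0
C = 72.0 ** p
x = np.linspace(0.0, 1.0, 40001)

def inequality_holds(k, eps):
    u = np.sin(k * x)
    du = k * np.cos(k * x)
    d2u = -k**2 * np.sin(k * x)
    # integrals over (0,1) approximated by uniform-grid averages
    lhs = np.mean(np.abs(du) ** p)
    rhs = eps**p * np.mean(np.abs(d2u) ** p) + (C / eps**p) * np.mean(np.abs(u) ** p)
    return lhs <= rhs

assert all(inequality_holds(k, eps)
           for k in (1.0, 10.0, 50.0)
           for eps in (0.02, 0.1, 1.0))
```

A check like this cannot prove the statement, but it catches sign or exponent mistakes in a proposed constant quickly.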


There are 2 solutions below.

Solution 1.

You are almost on the right track. Following the hint, the mean value theorem gives $\hat x \in (a,b)$, with $b - a = \epsilon$, such that $$u'(\hat x) = \frac1\epsilon \, (u(b) - u(a)).$$ By the fundamental theorem of calculus, you find $$u'(x) = \int_{\hat x}^x u''(t) \, \mathrm{d}t + u'(\hat x) = \int_{\hat x}^x u''(t) \, \mathrm{d}t + \frac1\epsilon \, (u(b) - u(a)).$$ In particular, we get the estimate $$|u'(x)| \leq \int_a^b |u''(t)| \, \mathrm{d}t + \frac1\epsilon \, |u(b) - u(a)| \quad \forall x \in [a,b].$$ Now, set $a = x$, $b = x+\epsilon$, use some elementary estimates, and integrate from $0$ to $1-\epsilon$ (the remaining part can be handled analogously with $a = x-\epsilon$, $b = x$).
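The pointwise estimate $|u'(x)| \leq \int_a^b |u''(t)|\,\mathrm{d}t + \frac1\epsilon |u(b)-u(a)|$ can itself be tested numerically. The choice $u(x)=\sin(kx)$ and the specific parameters below are illustrative assumptions, not part of the argument:

```python
import numpy as np

def pointwise_bound_holds(k, a, eps, n=20001):
    """Check max_[a,b] |u'| <= int_a^b |u''| dt + |u(b)-u(a)|/eps
    for the sample function u(x) = sin(k*x) on [a, a+eps]."""
    b = a + eps
    t = np.linspace(a, b, n)
    max_du = np.max(np.abs(k * np.cos(k * t)))               # max of |u'| on [a,b]
    int_d2u = eps * np.mean(np.abs(-k**2 * np.sin(k * t)))   # integral via grid average
    diff_term = abs(np.sin(k * b) - np.sin(k * a)) / eps
    return max_du <= int_d2u + diff_term

assert all(pointwise_bound_holds(k, a, eps)
           for (k, a, eps) in [(1.0, 0.0, 0.1), (3.0, 0.2, 0.3), (20.0, 0.1, 0.05)])
```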

Solution 2.

Got it: Let $(a,b)\subset (0,1)$ with $b-a=\epsilon$. If we choose $x_1\in(a,a+\epsilon/3)$ and $x_2\in(b-\epsilon/3,b)$, then $x_2-x_1\geq\epsilon/3$, so by the mean value theorem there exists $x_o\in(a,b)$ satisfying $$|u'(x_o)|\leq\frac{3}{\epsilon}\big(|u(x_1)|+|u(x_2)|\big).$$ The fundamental theorem of calculus implies $$u'(x)=\int_{x_o}^{x}u''(t)\,dt + u'(x_o),$$ and hence $$|u'(x)|\leq\frac{3}{\epsilon}\big(|u(x_1)|+|u(x_2)|\big)+\int_{a}^{b}|u''(t)|\,dt.$$ Averaging with respect to $x_1$ over $(a,a+\epsilon/3)$ and $x_2$ over $(b-\epsilon/3,b)$ yields $$|u'(x)|\leq\frac{18}{\epsilon^2}\int_{a}^{b}|u(t)|\,dt+\int_{a}^{b}|u''(t)|\,dt.$$ Raising both sides to the power $p$ and applying Hölder's inequality, together with $(A+B)^p\leq 2^{p-1}(A^p+B^p)$, one gets $$|u'(x)|^p\leq2^{p-1}\left(\frac{18^p}{\epsilon^{p+1}}\int_{a}^{b}|u(t)|^p\,dt+\epsilon^{p-1}\int_{a}^{b}|u''(t)|^p\,dt\right).$$ As suggested by the hint, integrate over $(a,b)$ with respect to $x$: $$\int_{a}^{b}|u'(x)|^p\,dx\leq2^{p-1}\left(\frac{18^p}{\epsilon^{p}}\int_{a}^{b}|u(t)|^p\,dt+\epsilon^{p}\int_{a}^{b}|u''(t)|^p\,dt\right).$$ The desired result is obtained by dividing $(0,1)$ into subintervals of length $\epsilon$ (assuming for simplicity that $1/\epsilon$ is an integer; otherwise the constant is adjusted accordingly) and summing the above inequality over the subintervals.
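The per-interval estimate at the end of this argument can also be checked numerically. The sample function $u(x)=\sin(kx)$, the exponent $p=2$, and the particular intervals below are assumptions for the test only, not part of the proof:

```python
import numpy as np

def interval_check(k, a, eps, p=2.0, n=20001):
    """Check the per-interval estimate
    int_a^b |u'|^p <= 2^(p-1) * (18^p / eps^p * int_a^b |u|^p + eps^p * int_a^b |u''|^p)
    on (a, a+eps) for the sample function u(x) = sin(k*x)."""
    x = np.linspace(a, a + eps, n)
    u = np.sin(k * x)
    du = k * np.cos(k * x)
    d2u = -k**2 * np.sin(k * x)
    integ = lambda f: eps * np.mean(f)       # integral over (a, a+eps) via grid average
    lhs = integ(np.abs(du) ** p)
    rhs = 2 ** (p - 1) * (18.0**p / eps**p * integ(np.abs(u) ** p)
                          + eps**p * integ(np.abs(d2u) ** p))
    return lhs <= rhs

assert all(interval_check(k, a, eps)
           for k in (1.0, 10.0, 40.0)
           for a in (0.0, 0.3)
           for eps in (0.05, 0.2, 0.5))
```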