Let $\Omega$ be a bounded domain with regular boundary. Assuming that the functions in $C_c^{\infty}(\Omega)$ are dense in $H_0^2(\Omega)$ (with the norm of $H^2(\Omega)$), I need to show that $$\|u\|_{h_0^2(\Omega)}:=\left(\int_{\Omega}|\Delta u|^2(x)\,dx\right)^{\frac{1}{2}}$$ is actually a norm. Thanks in advance.
How to prove this is a norm in $H_0^2(\Omega)$?
346 Views. Asked by Bumbble Comm. There are 2 best solutions below.
You just need to check that for every $f,g\in H^2_0(\Omega)$ and every $\lambda\in\mathbb{R}$, the following properties hold.
Absolute homogeneity. Here we use the fact that $\Delta$ and the integral are linear operators, and moreover that $\sqrt{\lambda^2}=|\lambda|$: $$ \|\lambda f\|=\left(\int_\Omega |\Delta (\lambda f)|^2\right)^{\frac{1}{2}}=|\lambda|\left(\int_\Omega |\Delta f|^2\right)^{\frac{1}{2}}=|\lambda|\,\|f\| $$ Triangle inequality. It follows immediately from the Minkowski inequality:
$$ \|f+g\|=\left(\int_\Omega |\Delta (f+g)|^2\right)^{\frac{1}{2}}\le \left(\int_\Omega |\Delta f|^2\right)^{\frac{1}{2}}+\left(\int_\Omega |\Delta g|^2\right)^{\frac{1}{2}}= \|f\|+\|g\| $$
Zero vector. You should check that whenever $\|f\|=0$, then $f=0$ a.e. in $\Omega$. Observe that $f$ satisfies the Dirichlet problem
$$ \begin{cases} \Delta f=0\qquad\text{in}\; \Omega\\ f=0 \qquad\text{on}\; \partial\Omega \end{cases} $$ Integrating by parts (which is justified for $f\in H_0^2(\Omega)$ by the density of $C_c^{\infty}(\Omega)$), $$ \int_\Omega |\nabla f|^2=-\int_\Omega (\Delta f ) f=0, $$ hence $\nabla f=0$ and $f=const$ a.e. in $\Omega$. Because of the boundary conditions, we conclude $f=0$ a.e. in $\Omega$.
(I hope you can fill the gaps in the proof)
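One of those gaps is the integration by parts itself, which for an $H_0^2(\Omega)$ function is justified by the density assumption stated in the question; a sketch of that approximation argument:

```latex
% Sketch: take f_n in C_c^\infty(Omega) with f_n -> f in H^2(Omega).
\begin{align*}
\int_\Omega |\nabla f_n|^2 \,dx
  &= -\int_\Omega (\Delta f_n)\, f_n \,dx
  && \text{(classical integration by parts; no boundary term, compact support)}\\
\int_\Omega |\nabla f|^2 \,dx
  &= -\int_\Omega (\Delta f)\, f \,dx
  && \text{(pass to the limit, using } f_n \to f \text{ in } H^2(\Omega)\text{)}
\end{align*}
```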
See here if you don't know which properties I'm referring to
Absolute homogeneity: For $\lambda \in \mathbb{C},$ we have $$\|\lambda u\|_{h_{0}^{2}(\Omega)} = \|\Delta(\lambda u)\|_{L^{2}(\Omega)} = \|\lambda (\Delta u)\|_{L^{2}(\Omega)} = |\lambda| \|\Delta u\|_{L^{2}(\Omega)}= |\lambda|\|u\|_{h_{0}^{2}(\Omega)}$$ where I've used the absolute homogeneity of the $L^{2}(\Omega)$-norm.
Triangle Inequality: \begin{align*}\|u+v\|_{h_{0}^{2}(\Omega)} &= \|\Delta( u+v)\|_{L^{2}(\Omega)} = \|\Delta u + \Delta v\|_{L^{2}(\Omega)} \leq \|\Delta u\|_{L^{2}(\Omega)} + \|\Delta v\|_{L^{2}(\Omega)} \\ &= \|u\|_{h_{0}^{2}(\Omega)} + \|v\|_{h_{0}^{2}(\Omega)} \end{align*} where I used the triangle inequality on $L^2(\Omega)$ (also known as Minkowski's inequality).
Finally, let $\|u\|_{h_{0}^{2}(\Omega)}= 0$. This implies $\Delta u = 0$ a.e. in $\Omega$. We have to show that this implies $u=0$. By integration by parts, we obtain
$$0 = - \int_{\Omega} \Delta u \cdot u ~\mathrm{d}x = \int_{\Omega}|\nabla u|^2 \mathrm{d}x$$ which shows that $\nabla u = 0$ a.e. Since $u \in H_0^2(\Omega) \subset H_0^1(\Omega)$, the Poincaré inequality shows that the (classical) Sobolev norm of $u$ is $0$, and therefore $u$ is $0$.
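The final Poincaré step can be written out explicitly; here $C_\Omega$ denotes the (domain-dependent) Poincaré constant of the bounded domain $\Omega$:

```latex
% Poincare inequality on the bounded domain Omega, applied to u in H_0^1(Omega):
\begin{align*}
\|u\|_{L^2(\Omega)} &\le C_\Omega \,\|\nabla u\|_{L^2(\Omega)} = 0
  && \text{(since } \nabla u = 0 \text{ a.e. in } \Omega\text{)}\\
\Rightarrow\quad u &= 0 \quad \text{a.e. in } \Omega
\end{align*}
```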