Energy Method for Regularizing Effect of Heat Equation


I am trying to show the following: Let $u$ solve the homogeneous heat equation in the cylinder $\Omega \times (0, \infty)$ with vanishing Dirichlet data and initial condition $g$. Multiply the PDE by $tu_t$ and apply the energy method to show that $\int_{\Omega}|\nabla u(x,t)|^2\,dx < \frac{1}{t}\int_{\Omega}|g(x)|^2\,dx$.

I'm not sure how to apply the energy method once I have multiplied the PDE through by $tu_t$.
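For what it's worth, the only step I have so far (assuming the boundary term from integration by parts vanishes because $u$, and hence $u_t$, is zero on $\partial\Omega$) is: multiplying the PDE by $tu_t$ and integrating over $\Omega$ gives
$$t\int_\Omega u_t^2\,dx = t\int_\Omega u_t\,\Delta u\,dx = -t\int_\Omega \nabla u_t\cdot\nabla u\,dx = -\frac{t}{2}\,\frac{d}{dt}\int_\Omega |\nabla u|^2\,dx,$$
but I don't see how to turn this into the claimed bound.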


There are two answers below.


Hint

One-dimensional heat equation (so $\nabla u=u_x$):

$$u_t = c\,\Delta u = c\,u_{xx}$$

Energy method: define $$E(t)=\int_{\Omega} u^2 \,dx.$$

Due to the vanishing Dirichlet BC you get $$E'(t)=\int_{\Omega} 2 u u_t \,dx =\int_{\Omega} 2 c\, u u_{xx} \,dx = 2c\, u u_x\Big|_{\partial\Omega}-2c\int_{\Omega} u_x^2 \,dx =-2c\int_{\Omega} u_x^2 \,dx = -2c\int_{\Omega}\|\nabla u\|^2 \,dx \le 0.$$ You also have to explain here why you are allowed to differentiate under the integral.

Since $E'\le0$, $E$ is decreasing, and together with
$$E(0)=\int_{\Omega} g^2 \,dx$$ we get $$E(t)\le \int_{\Omega} g^2 \,dx \quad \forall t\ge0.$$

So you get $$\int_{\Omega}\|\nabla u\|^2 \,dx= -\frac{1}{2c}E'(t)=\frac{1}{2c}\left|E'(t)\right|.$$

I actually fail to see why it should be $< \frac{1}{t}\int_{\Omega}|g(x)|^2\,dx$, though.


Not quite sure how to use the hint, but here are two possible solutions (compared with Michael Medvinky's post, I take $c=1$).

1) Spectral method: Let $$ -\Delta=\sum_{k=1}^\infty\lambda_k \langle\phi_k,\cdot\rangle\phi_k $$ be the spectral decomposition of the Laplacian with Dirichlet boundary conditions. The solution of the heat equation is given by $$ u(t,x)=\sum_{k=1}^\infty e^{-t\lambda_k}\langle \phi_k,g\rangle\phi_k(x). $$ Thus $$ \int_\Omega \lvert \nabla u(t,x)\rvert^2\,dx=-\int_\Omega u(t,x)\Delta u(t,x)\,dx=\sum_{k=1}^\infty \lambda_k e^{-2t\lambda_k}\lvert \langle\phi_k,g\rangle\rvert^2. $$ Since $\lambda e^{-2t\lambda}\leq (2et)^{-1}$ for $\lambda\geq 0$, we have $$ \int_\Omega \lvert\nabla u(t,x)\rvert^2\,dx\leq \frac{1}{2et}\sum_{k=1}^\infty \lvert\langle\phi_k,g\rangle\rvert^2=\frac{1}{2et}\int_\Omega g(x)^2\,dx. $$ If $g\neq 0$, we get the strict inequality claimed in the question (of course it's simply not true for $g=0$).
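For completeness, the elementary bound $\lambda e^{-2t\lambda}\leq (2et)^{-1}$ used above follows from a one-variable maximization:
$$\frac{d}{d\lambda}\left(\lambda e^{-2t\lambda}\right)=e^{-2t\lambda}(1-2t\lambda)=0\iff\lambda=\frac{1}{2t},\qquad \max_{\lambda\geq 0}\lambda e^{-2t\lambda}=\frac{1}{2t}\,e^{-1}=\frac{1}{2et}.$$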

2) Energy method: Let $$E(t)=\frac 1 2\int_\Omega u(t,x)^2\,dx.$$ As shown in Michael Medvinky's post, the derivative of $E$ is given by $$ E'(t)=-\int_\Omega \lvert \nabla u(t,x)\rvert^2\,dx. $$ The second derivative of $E$ is $$ E''(t)=\frac{d}{dt}\int_\Omega u(t,x)\Delta u(t,x)\,dx=\int_\Omega (\Delta u(t,x))^2+u(t,x)\Delta^2 u(t,x)\,dx=2\int_\Omega (\Delta u(t,x))^2\,dx\geq 0 $$ (assuming enough regularity, the boundary terms in the integration by parts vanish because $u=0$ and $\Delta u=u_t=0$ on $\partial\Omega$). Hence $E'(t)$ is increasing and we get $$ \frac1 2\int_\Omega g(x)^2\,dx=E(0)=E(t)-\int_0^t E'(s)\,ds\geq E(t)-t E'(t)\geq -tE'(t)=t\int_\Omega\lvert \nabla u(t,x)\rvert^2\,dx. $$ Again, you get the desired strict inequality from the question unless $g=0$.
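Spelling out the last step: dividing the chain of inequalities by $t>0$ gives
$$\int_\Omega\lvert \nabla u(t,x)\rvert^2\,dx\leq \frac{1}{2t}\int_\Omega g(x)^2\,dx<\frac{1}{t}\int_\Omega g(x)^2\,dx\quad\text{whenever } g\not\equiv 0.$$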