Energy for Inhomogeneous Heat Equation


Suppose $V(x,t)$, for $x\in \mathbb{R}^n$ and $t\geq 0$, is continuous with $V(x,t)\geq \epsilon >0$. I want to show that for any solution $u$ of the equation $\partial_t u+\Delta u + V(x,t)=0$, the energy integral $$ \int_{\mathbb{R}^n} |u(x,t)|^2 \; \text{d}x$$ must decay exponentially in time.

I really have no idea how to show this. A few hints would be deeply appreciated.



I'm assuming you meant $$u_t - \Delta u + V(x,t) = 0,$$ otherwise I don't think the result is true.

In this case, use the variation of parameters (Duhamel) formula and plug the heat kernel into the resulting expression. Then use standard estimates (I believe Young's and Jensen's inequalities work here) to bound the energy integral. Since you asked for a hint, I'll leave it at that. Let me know if you need a detail or two.
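For concreteness, here is the variation of parameters formula I have in mind, written under the assumption that $-V$ is treated as a forcing term in the equation above (this is the standard Duhamel formula for the heat equation, with $u(\cdot,0)$ denoting the initial data):

```latex
% Duhamel / variation of parameters for u_t - \Delta u + V = 0,
% treating -V(x,t) as an inhomogeneous forcing term.
\[
u(x,t) = \int_{\mathbb{R}^n} \Phi(x-y,t)\, u(y,0)\, \mathrm{d}y
       \;-\; \int_0^t \!\int_{\mathbb{R}^n} \Phi(x-y,t-s)\, V(y,s)\, \mathrm{d}y\, \mathrm{d}s,
\]
% where \Phi is the heat kernel on \mathbb{R}^n:
\[
\Phi(x,t) = (4\pi t)^{-n/2} \exp\!\left(-\frac{|x|^2}{4t}\right).
\]
```

The first term is the free heat evolution of the initial data; the second accumulates the effect of the forcing, and it is this term that the estimates above are meant to control.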