Heat equation: proving that smaller diffusion leads to bigger solution via energy methods

Let $\Omega$ be a bounded Lipschitz domain and, for a diffusion constant $\alpha > 0$, denote by $u_\alpha$ the solution of the heat equation $$u_t - \alpha \Delta u = f$$ with right-hand side $f \in L^2(0,T;L^2(\Omega))$, given initial datum $u_\alpha(0) = u_0$, and homogeneous Dirichlet boundary condition $u_\alpha|_{\partial\Omega} = 0$.

We have the regularity $$u_\alpha \in L^2(0,T;H^1_0(\Omega) \cap H^2(\Omega)) \cap H^1(0,T;L^2(\Omega)).$$ If the diffusion constant gets smaller, the solution should get bigger, in the sense that $u_\alpha(t) \geq u_\beta(t)$ a.e. in $\Omega$, for a.e. $t \in (0,T)$, whenever $\alpha \leq \beta$. How can I show this via energy methods, or at least without using more regularity of the solutions than stated above?
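
To make the question concrete, here is the setup I have in mind (a formal sketch; the weak formulation is meant throughout). Subtracting the equations for $u_\beta$ and $u_\alpha$, the difference $w := u_\beta - u_\alpha$ satisfies $$w_t - \alpha \Delta w = (\beta - \alpha)\,\Delta u_\beta, \qquad w(0) = 0, \qquad w|_{\partial\Omega} = 0.$$ Testing with the positive part $w^+ := \max(w,0) \in L^2(0,T;H^1_0(\Omega))$ and using $\nabla w^+ = \chi_{\{w > 0\}}\nabla w$ gives $$\frac{1}{2}\frac{d}{dt}\|w^+(t)\|_{L^2(\Omega)}^2 + \alpha \|\nabla w^+(t)\|_{L^2(\Omega)}^2 = (\beta - \alpha)\int_\Omega \Delta u_\beta\, w^+ \, dx.$$ If the right-hand side were nonpositive, integrating in time together with $w^+(0) = 0$ would give $w^+ \equiv 0$, i.e. $u_\alpha \geq u_\beta$. Since $\beta - \alpha \geq 0$, this amounts to controlling the sign of $\int_\Omega \Delta u_\beta \, w^+ \, dx$, which is where I am stuck.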