Consider the heat equation $$u_t-u_{xx}=0$$ for $(t,x)\in(0,\infty)\times(0,1)$, with initial condition $u(0,x)=x(1-x)$ and boundary conditions $u(t,0)=u(t,1)=0$.
I am trying to show that $u$ decays uniformly to $0$; more precisely, that $u(t,x)\leq x(1-x)e^{-t}$. One idea is to consider $v(t,x)=e^tu(t,x)$, which satisfies $v_t-v_{xx}=v$ with the same initial and boundary conditions: since $u_t-u_{xx}=0$, $$v_t-v_{xx}=e^tu+e^tu_t-e^tu_{xx}=v+e^t(u_t-u_{xx})=v.$$
From here it seems the problem can be solved by writing down the infinite series solution, but I would like to know whether there is a more elementary approach. This may be related as well; it shows that the $L^2$ norm of the solution decays exponentially.
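For reference, the series route mentioned above is explicit here (assuming the normalization $l=1$ from the problem statement): expanding $x(1-x)$ in a Fourier sine series on $(0,1)$ gives coefficients $\frac{8}{\pi^3n^3}$ for odd $n$ and $0$ for even $n$, so
$$u(t,x)=\sum_{n\ \text{odd}}\frac{8}{\pi^3n^3}\,e^{-n^2\pi^2t}\sin(n\pi x),$$
which in fact decays like $e^{-\pi^2t}$, faster than the $e^{-t}$ rate asked for.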
Any help would be appreciated!
Notice that $w(t,x):=u(t,x)-x(1-x)e^{-t}$ is a subsolution of the heat equation, i.e., $\partial_t w-\partial_{xx}w\leq 0$. Indeed, since $u_t-u_{xx}=0$,
$$\partial_t w-\partial_{xx}w=x(1-x)e^{-t}-2e^{-t}=e^{-t}\bigl(x(1-x)-2\bigr)\leq 0,$$
because $x(1-x)\leq\tfrac14$ on $(0,1)$. It can be shown that a heat subsolution satisfies the weak maximum principle, so $w$ attains its maximum on the parabolic boundary $\{t=0\}\cup\{x=0\}\cup\{x=1\}$, where $w\leq 0$ by the given initial and boundary conditions. Hence $w(t,x)\leq 0$ everywhere, i.e., $u(t,x)\leq x(1-x)e^{-t}$, and we're done.
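Not a substitute for the maximum-principle argument, but here is a quick finite-difference sanity check of the claimed bound; the explicit Euler scheme, grid size, final time, and tolerance below are my own choices for illustration:

```python
import numpy as np

# Solve u_t = u_xx on (0,1) with u(0,x) = x(1-x) and u(t,0) = u(t,1) = 0
# by explicit Euler in time and central differences in space, then check
# the comparison bound u(t,x) <= x(1-x) e^{-t} at every grid point.
nx = 101
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
dt = 0.4 * dx**2            # satisfies the stability condition dt <= dx^2 / 2
u = x * (1.0 - x)           # initial data

t = 0.0
ok = True
while t < 1.0:
    # one explicit step of the discrete heat equation on interior points
    u[1:-1] += (dt / dx**2) * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    u[0] = u[-1] = 0.0      # Dirichlet boundary conditions
    t += dt
    # verify the bound at this time level (small tolerance for rounding)
    if np.any(u > x * (1.0 - x) * np.exp(-t) + 1e-12):
        ok = False

print(ok)  # True: the bound holds at every grid point and time step
```

Because the stable explicit scheme obeys a discrete maximum principle, and $x(1-x)e^{-t}$ is a discrete supersolution for this scheme as well, the check passes with room to spare (the true decay rate is $e^{-\pi^2 t}$).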