Mistake in a PDE book regarding Lebesgue's differentiation theorem? To do with weak formulation


I'm reading *Elliptic and Parabolic Equations* by Wu, Yin and Wang. In Section 4.2, they consider the heat equation on a bounded domain $\Omega$, with data $u_0 \in L^\infty$ and $f \in L^\infty$: $$u_t - \Delta u = f,$$ $$u(0) = u_0,$$ where $u \in H^1(0,T;L^2) \cap L^2(0,T;H^1)$. Testing the equation with $(u-k)^+$, where $k$ exceeds the $L^\infty$ norm of $u_0$, and using the fact that $I_k(t) = \int_\Omega |(u(t)-k)^+|^2$ is continuous, hence attains its maximum at some $t=\sigma$, they derive the following:

*(image: the book's derivation of the final inequality)*
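For reference, the testing step described above presumably produces something like the following energy identity. This is my own reconstruction of the standard argument, not necessarily the book's exact display; the initial term drops out because $k \geq \|u_0\|_{L^\infty}$ forces $(u_0-k)^+ = 0$.

```latex
% Multiply u_t - \Delta u = f by (u-k)^+ and integrate over \Omega:
\[
\frac{1}{2}\frac{d}{dt}\int_\Omega \bigl|(u-k)^+\bigr|^2 \,dx
  + \int_\Omega \bigl|\nabla (u-k)^+\bigr|^2 \,dx
  = \int_\Omega f\,(u-k)^+ \,dx .
\]
% Integrate in time over (0,\sigma); the term at t=0 vanishes
% since (u_0 - k)^+ = 0 when k \ge \|u_0\|_{L^\infty}:
\[
\frac{1}{2}\, I_k(\sigma)
  + \int_0^\sigma\!\!\int_\Omega \bigl|\nabla (u-k)^+\bigr|^2 \,dx\,dt
  = \int_0^\sigma\!\!\int_\Omega f\,(u-k)^+ \,dx\,dt .
\]
```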

This is the step I don't understand. Here $\sigma$ is NOT a variable, but a particular point of $[0,T]$ chosen so that $I_k(\sigma) \geq I_k(t)$ for all $t$, as mentioned above.

They seem to have applied Lebesgue's differentiation theorem to obtain the final inequality, but that theorem holds only for almost every $t \in [0,T]$, and $\sigma$ may lie in the exceptional null set. Is there some other way to derive this?