Consider the heat equation on $\mathbb{R}$, $$ u_t=u_{xx}, $$ with initial condition $u(0,x)=g(x)$.
It is well known that even if the function $g$ is "very bad" (say, only bounded but not continuous), the function $u(t,\cdot)$ is of class $C^{\infty}$ for every $t>0$, however small.
My question is whether this smoothing effect can be quantified. Namely, assuming that $g\in C^\gamma$ for some $\gamma\in(0,1)$, can one prove an estimate of the form $$ \|u(t,\cdot)\|_{C^1}\le C\,t^{-\lambda}\|g\|_{C^\gamma} $$ for some $\lambda>0$ and a constant $C$ independent of $t$ and $g$?
Thanks!
Yes, you can. But things are not completely elementary. You need the following ingredients:
1) The second derivative generates a (not strongly continuous!) semigroup on $L^\infty(\mathbb R)$.
2) This semigroup is analytic and hence maps the ambient space into the domain of any power of its generator.
3) Such domains are known to embed into spaces of Hölder functions, and hence the semigroup induces "restricted" semigroups on $C^\alpha(\mathbb R)$, for which your estimates can then be proved to hold.
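For what it's worth, the exponent itself can be read off from a direct heat-kernel computation (a sketch, with $G(t,z)=(4\pi t)^{-1/2}e^{-z^{2}/4t}$ the Gaussian kernel and $[g]_{C^\gamma}$ the Hölder seminorm). Since $\int_{\mathbb R}\partial_x G(t,x-y)\,dy=0$,
$$ u_x(t,x)=\int_{\mathbb R}\partial_x G(t,x-y)\bigl(g(y)-g(x)\bigr)\,dy, $$
and using $|g(y)-g(x)|\le[g]_{C^\gamma}|x-y|^{\gamma}$ together with $|\partial_z G(t,z)|=\frac{|z|}{2t}\,G(t,z)$ and the substitution $z=\sqrt{t}\,s$,
$$ |u_x(t,x)|\le\frac{[g]_{C^\gamma}}{2t}\int_{\mathbb R}|z|^{1+\gamma}G(t,z)\,dz=C_\gamma\,t^{-(1-\gamma)/2}\,[g]_{C^\gamma}. $$
So one expects $\lambda=(1-\gamma)/2$ in the estimate above.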
You can find all this (and much more) in Chapter 3 of Lunardi's nice monograph *Analytic Semigroups and Optimal Regularity in Parabolic Problems* (Birkhäuser, 1995).
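As a numerical sanity check of the scaling $\|u_x(t,\cdot)\|_\infty\sim t^{-(1-\gamma)/2}$, one can convolve a Hölder (but not Lipschitz) datum with the heat kernel by quadrature. This is a rough sketch, not part of the theory above: the datum $g(x)=\operatorname{sign}(x)\,|x|^\gamma$, the domain size, and the grid are ad hoc choices.

```python
import numpy as np

# Sketch: for the C^gamma (non-Lipschitz) datum g(x) = sign(x)|x|^gamma,
# the heat semigroup should give u_x(t, 0) ~ C t^{-(1-gamma)/2},
# i.e. lambda = (1 - gamma)/2.  Grid and domain sizes are ad hoc.
gamma = 0.5

def ux_at_origin(t, L=50.0, n=200001):
    """Approximate u_x(t, 0) = \\int d/dx G(t, -y) g(y) dy by a Riemann sum."""
    y = np.linspace(-L, L, n)
    dy = y[1] - y[0]
    g = np.sign(y) * np.abs(y) ** gamma
    # Heat kernel G(t, z) = exp(-z^2 / 4t) / sqrt(4 pi t);
    # d/dx G(t, x - y) evaluated at x = 0 equals (y / 2t) G(t, y).
    dG = (y / (2 * t)) * np.exp(-y**2 / (4 * t)) / np.sqrt(4 * np.pi * t)
    return np.sum(dG * g) * dy

t1, t2 = 1e-2, 1e-3
ratio = ux_at_origin(t2) / ux_at_origin(t1)
# Expect ratio close to (t1/t2)^((1-gamma)/2), i.e. 10^0.25 for gamma = 1/2.
print(ratio, (t1 / t2) ** ((1 - gamma) / 2))
```

Halving $t$ by a factor of $10$ should multiply the gradient at the kink by roughly $10^{(1-\gamma)/2}$, which the printed numbers let you compare directly.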