Let $\Omega \subset \mathbb{R}^n$ be a bounded smooth domain and consider the Neumann problem $$-\Delta u + ku = f \quad \text{in } \Omega,$$ $$\partial_\nu u = 0 \quad \text{on } \partial\Omega,$$ where $k>0$ is a constant and $f \in L^2(\Omega)$.
By standard elliptic regularity, we know that $u \in H^2(\Omega)$.
Is it possible to deduce the additional regularity $u \in L^\infty(\Omega)$?
If $f \in L^\infty(\Omega)$, then this is true (see e.g. http://www.math.tu-berlin.winkert.de/publications/papers/PreprintNODEA2010.pdf). However, under the $L^2$ assumption alone I am not sure. I had thought there would be an $L^2$–$L^\infty$ smoothing effect?
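One remark, in case it sharpens the question (this is just the standard Sobolev embedding, independent of the particular equation): in low dimensions the $H^2$ regularity already gives boundedness, since $H^k(\Omega) \hookrightarrow L^\infty(\Omega)$ whenever $k > n/2$, which with $k = 2$ reads $$H^2(\Omega) \hookrightarrow C^0(\overline{\Omega}) \subset L^\infty(\Omega) \quad \text{for } n \le 3.$$ For $n \ge 4$ this embedding fails, so membership in $H^2(\Omega)$ alone does not imply boundedness, and any $L^2$–$L^\infty$ smoothing would have to come from the equation itself rather than from Sobolev embedding.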