Consider the eigenvalue equation $$ \tag 1 \Delta f + V f = \lambda f$$ where $\Delta+V$ is defined on $D=\{f\in L^2(\mathbb{R}^3) \mid \Delta f\in L^2(\mathbb{R}^3) \}$. The potential $V:\mathbb{R}^3\to \mathbb{R}$ can be assumed to lie in $L^2(\mathbb{R}^3)+L^{\infty}(\mathbb{R}^3)$; indeed, a theorem of Kato ensures that in this case $\Delta+V:D\to L^2(\mathbb{R}^3)$ is self-adjoint on $D$.
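(For instance, the Coulomb potential belongs to this class: splitting at the unit sphere,
$$ -\frac{1}{|x|} \;=\; -\frac{\chi_{\{|x|\le 1\}}(x)}{|x|} \;-\; \frac{\chi_{\{|x|> 1\}}(x)}{|x|}, $$
the first term is in $L^2(\mathbb{R}^3)$, since $\int_0^1 r^{-2}\, r^2\,dr<\infty$, and the second is bounded by $1$.)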
I found this statement in some* books about Quantum Mechanics:
> if $f\in L^{\infty}(\mathbb{R}^3)\setminus L^2(\mathbb{R}^3)$ solves $(1)$, then $\lambda\in \sigma(\Delta+V)$ belongs to the continuous spectrum.
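(As a sanity check, this is consistent with the free case $V=0$: plane waves are bounded, not square-integrable, and solve the eigenvalue equation,
$$ f_k(x)=e^{ik\cdot x}\in L^{\infty}(\mathbb{R}^3)\setminus L^2(\mathbb{R}^3), \qquad \Delta f_k = -|k|^2 f_k, $$
and indeed $\sigma(\Delta)=(-\infty,0]$ is purely continuous, with no eigenvalues.)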
How can we prove this?
Furthermore
does this hold if, instead of $\Delta+V$, we have a generic self-adjoint operator densely defined on $L^2(\Omega)$? What happens if we only require that $\langle f,\varphi\rangle$ be finite for all $\varphi\in \mathcal{S}(\Omega)$, the Schwartz space?
(My try) First of all, I think one should assume that no solution of $(1)$ for that $\lambda$ lies in $L^2$; otherwise $\Delta+V-\lambda$ would have a nontrivial kernel and $\lambda$ would be an eigenvalue. Given that, I would like to prove that $\lambda\in\sigma(\Delta+V)$, e.g. by showing that the range of $\Delta+V-\lambda$ is not closed. Therefore I would like to find a sequence $f_n\in L^2(\mathbb{R}^3)$, $\|f_n\|_{L^2}=1$, such that $\|(\Delta + V-\lambda)f_n\|_{L^2}\to 0$. For example, by finding a sequence that approximates $f$ in some suitable sense, but I don't know any density theorem of such a sort.
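A natural candidate for such a sequence (Weyl's criterion), assuming enough regularity on $f$ to make sense of the error terms, is a normalized truncation of $f$: take $\chi\in C_c^{\infty}(\mathbb{R}^3)$ with $\chi=1$ on the unit ball, set $\chi_n(x)=\chi(x/n)$, and
$$ f_n = \frac{\chi_n f}{\|\chi_n f\|_{L^2}}, \qquad (\Delta+V-\lambda)f_n = \frac{2\,\nabla\chi_n\cdot\nabla f + (\Delta\chi_n)\,f}{\|\chi_n f\|_{L^2}}, $$
using $(\Delta+V-\lambda)f=0$ and the product rule $\Delta(\chi_n f)=\chi_n\Delta f+2\nabla\chi_n\cdot\nabla f+(\Delta\chi_n)f$. Here $|\nabla\chi_n|=O(1/n)$ and $|\Delta\chi_n|=O(1/n^2)$, while $\|\chi_n f\|_{L^2}\to\infty$ precisely because $f\notin L^2$; the remaining (nontrivial) step would be to show that the numerator grows more slowly than the denominator.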
*e.g. Faddeev-Yakubovsky, "Lectures on Quantum Mechanics for Mathematics Students", Section 31, "The Radial Schrödinger Equation".