Inequality for spectral families of self-adjoint operators


Let $A,B$ be self-adjoint operators on a Hilbert space $\mathcal{H}$ with $D(A) \subset D(B)$ and $A \leq B$. By the spectral theorem, we write:

$$A = \int_{\mathbb{R}} \lambda \, E^A(d\lambda), \qquad B = \int_{\mathbb{R}} \lambda \, E^B(d\lambda).$$

Does one have $E^A([n,+\infty[) \leq E^B([n,+\infty[)$?

If we take some $\psi \in D(A) \cap R\left(E^A([n,+\infty[)\right)$ (where $R$ denotes the range), then $n \|\psi\|^2 \leq \langle \psi \vert A \psi \rangle \leq \langle \psi \vert B \psi \rangle$, which suggests $\psi \in D(A) \cap R\left(E^B([n,+\infty[)\right)$. But this isn't enough: we would need $R\left(E^A([n,+\infty[)\right) \subset R\left(E^B([n,+\infty[)\right)$ to get the result, and I don't see how to obtain this (I tried extending by density, but it doesn't seem to work). Maybe the inequality just isn't true.
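One way to probe that last doubt is a finite-dimensional experiment. The sketch below (the specific $2\times 2$ matrices are my own choice, not from the question) builds a pair with $A \leq B$, forms the spectral projections $E^A([n,+\infty[)$ and $E^B([n,+\infty[)$ directly from eigendecompositions, and checks whether their difference is positive semidefinite:

```python
import numpy as np

# Hypothetical test pair (my own choice): A <= B, since
# B - A = [[1, 1], [1, 1]] is positive semidefinite (eigenvalues 0 and 2).
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[2.0, 1.0],
              [1.0, 1.0]])

def spectral_projection(M, n):
    """Orthogonal projection E^M([n, +oo[): sum of eigenprojections
    of the symmetric matrix M with eigenvalue >= n."""
    eigvals, eigvecs = np.linalg.eigh(M)
    cols = eigvecs[:, eigvals >= n]
    return cols @ cols.T

n = 1.0
PA = spectral_projection(A, n)
PB = spectral_projection(B, n)

# A <= B holds: smallest eigenvalue of B - A is ~0.
print("min eig of B - A:", np.linalg.eigvalsh(B - A).min())

# If E^A([n,+oo[) <= E^B([n,+oo[), then PB - PA would be PSD;
# a negative eigenvalue here means the projection inequality fails.
print("min eig of PB - PA:", np.linalg.eigvalsh(PB - PA).min())
```

For this pair, $PB - PA$ has a negative eigenvalue (both projections have rank one but different ranges, and rank-one projections are comparable only if they are equal), which matches the suspicion at the end of the question that the inequality can fail even though $A \leq B$.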