Sum of the spectra of two self-adjoint operators


I am asking something a little more general than my previous question. It seems "trivial", but I cannot find this statement anywhere, and I have my doubts.

Let $L_1$ and $L_2$ be two self-adjoint operators on $L^2(\mathbb{R})$. Then, for every nonzero $v$ (in the appropriate domains), we have the following:

  1. $\min(\sigma(L_1)+\sigma(L_2))\leq \frac{\left<v,(L_1+L_2)v\right>}{||v||^2}\leq \max(\sigma(L_1)+\sigma(L_2))$
  2. $\sigma(L_1+L_2)\subset [\min(\sigma(L_1)+\sigma(L_2)),\max(\sigma(L_1)+\sigma(L_2))],$

where $\left<\cdot,\cdot\right>$ and $\|\cdot\|$ denote the $L^2(\mathbb{R})$ inner product and norm, and $\sigma(L_1)+\sigma(L_2)$ is the sumset $\{\lambda_1+\lambda_2 : \lambda_i\in\sigma(L_i)\}$.

Idea of proof. Since each $L_i$ is self-adjoint, its numerical range is controlled by its spectrum, so

  1. $\min(\sigma(L_1))\leq \frac{\left<v,L_1v\right>}{||v||^2}\leq \max(\sigma(L_1))$
  2. $\min(\sigma(L_2))\leq \frac{\left<v,L_2v\right>}{||v||^2}\leq \max(\sigma(L_2)).$

Adding 1 and 2 above, and using $\min(\sigma(L_1)+\sigma(L_2))=\min(\sigma(L_1))+\min(\sigma(L_2))$ (and similarly for the max), yields statement 1 of the theorem. Because $L_1+L_2$ is self-adjoint, its spectrum is contained in the closure of its numerical range, so statement 1 implies statement 2. If this is true, I would really be interested in a reference.
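In finite dimensions, where $\sigma(L_i)$ is just the set of eigenvalues and the min/max are attained, statement 1 can be checked numerically. The following is a sketch with random symmetric matrices standing in for $L_1$ and $L_2$; the matrices, dimension, and tolerance are illustrative assumptions, not from the question:

```python
import numpy as np

rng = np.random.default_rng(0)

def rayleigh(L, v):
    # Rayleigh quotient <v, L v> / ||v||^2
    return float(np.vdot(v, L @ v).real / np.vdot(v, v).real)

# Random symmetric (hence self-adjoint) matrices as finite-dimensional stand-ins.
n = 5
M1 = rng.standard_normal((n, n))
M2 = rng.standard_normal((n, n))
L1 = (M1 + M1.T) / 2
L2 = (M2 + M2.T) / 2

# min/max of the sumset sigma(L1) + sigma(L2).
lo = np.linalg.eigvalsh(L1).min() + np.linalg.eigvalsh(L2).min()
hi = np.linalg.eigvalsh(L1).max() + np.linalg.eigvalsh(L2).max()

# Statement 1: every Rayleigh quotient of L1 + L2 lies in [lo, hi].
for _ in range(100):
    v = rng.standard_normal(n)
    r = rayleigh(L1 + L2, v)
    assert lo - 1e-9 <= r <= hi + 1e-9
```

This is the finite-dimensional content of the inequality: by Weyl's inequalities, $\lambda_{\min}(L_1+L_2)\geq\lambda_{\min}(L_1)+\lambda_{\min}(L_2)$ and likewise for the max, so the Rayleigh quotient of the sum never escapes $[\,\mathrm{lo},\mathrm{hi}\,]$.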

Best answer

The second part fails. Let $$L_1=\begin{pmatrix} 1 & 0\\ 0&0\end{pmatrix},\quad L_2=\begin{pmatrix} 0 & 1\\ 1&0\end{pmatrix}.$$ Then $\sigma(L_1)=\{0,1\}$ and $\sigma(L_2)=\{-1,1\}$. However $L_1+L_2=\begin{pmatrix} 1 & 1\\ 1&0\end{pmatrix}$, so $\sigma(L_1+L_2)=\{a,b\}$ where $a,b$ satisfy $x(1-x)=-1$, i.e. $x^2-x-1=0$; hence $a,b=\frac{1\pm\sqrt{5}}{2}$ are irrational, and in particular neither lies in the sumset $\sigma(L_1)+\sigma(L_2)=\{-1,0,1,2\}$.

We can realize the example above on any Hilbert space of dimension greater than or equal to $2$.
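The eigenvalue computation in the counterexample is easy to confirm numerically; a minimal sketch with numpy:

```python
import numpy as np

L1 = np.array([[1.0, 0.0], [0.0, 0.0]])
L2 = np.array([[0.0, 1.0], [1.0, 0.0]])

# L1 + L2 = [[1, 1], [1, 0]]; its characteristic polynomial is
# x^2 - x - 1 = 0, i.e. x(1 - x) = -1, with roots (1 ± sqrt(5))/2.
eigs = np.linalg.eigvalsh(L1 + L2)  # ascending order
roots = np.array([(1 - np.sqrt(5)) / 2, (1 + np.sqrt(5)) / 2])
assert np.allclose(eigs, roots)

# Neither root lies in the sumset sigma(L1) + sigma(L2) = {-1, 0, 1, 2}.
sumset = (-1.0, 0.0, 1.0, 2.0)
assert all(min(abs(e - s) for s in sumset) > 0.3 for e in eigs)
```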