Show that if $T$ is a self-adjoint linear operator on a Hilbert space whose spectrum consists of a single point $\lambda$, then $T=\lambda I$. Then show that this fails if $T$ is not self-adjoint.
I am very unsure of my answer for part 1. Can you verify it?
Since $T$ is self-adjoint, its residual spectrum is empty, so $\lambda\notin \sigma_r(T)$. Hence either $\lambda \in \sigma_p(T)$ (point spectrum) or $\lambda \in \sigma_c(T)$ (continuous spectrum).
Note: Since $T$ is self-adjoint, $\langle Tu,v \rangle=\langle u,Tv \rangle$ for all $u,v$, and $\lambda$ is real, so $\langle \lambda u,v \rangle=\langle u,\overline \lambda v \rangle=\langle u,\lambda v \rangle$.
Case 1: $\lambda \in \sigma_p(T)$
Then $(T-\lambda I)u=0$ for some nonzero $u$. So for all $v\in\mathcal H$, $$\langle Tu,v \rangle-\langle \lambda u,v \rangle =\langle u,Tv \rangle-\langle u,\lambda v \rangle,$$ so $$\langle(T-\lambda I)u,v \rangle=\langle u,(T-\lambda I)v \rangle=0 \quad \forall v\in \mathcal H.$$
This implies that $T=\lambda I$.
Case 2: $\lambda \in \sigma_c(T)$
Then for every $u\in \overline{R(T-\lambda I)}\setminus R(T-\lambda I)$ there exists a sequence $u_n\to u$ in $\mathcal H$ such that $\langle(T-\lambda I)u_n,v\rangle=0$ for all $v\in\mathcal H$. This means $$\langle (T-\lambda I)u_n,v \rangle=\langle u_n,(T-\lambda I)v \rangle \to \langle u,(T-\lambda I)v \rangle=0 \quad \forall v\in \mathcal H.$$ This implies $T=\lambda I$.
This completes the first part. However, I am unsure about my proof regarding the continuous spectrum.
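As a numerical sanity check of the statement itself (not of the proof), here is a quick finite-dimensional experiment with NumPy. The dimension and the value $\lambda=3$ are arbitrary choices on my part: any real symmetric matrix whose only eigenvalue is $\lambda$ should come out equal to $\lambda I$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric matrix with spectrum {3}:
# T = Q diag(3,3,3) Q^T for an orthogonal Q, which forces T = 3 Q Q^T = 3 I.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
T = Q @ (3 * np.eye(3)) @ Q.T

print(np.allclose(np.linalg.eigvalsh(T), 3))  # True: the only eigenvalue is 3
print(np.allclose(T, 3 * np.eye(3)))          # True: T really is 3I
```

Of course this only illustrates the claim in finite dimensions, where the spectrum is just the set of eigenvalues.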
For part 2, consider the non-self-adjoint matrix
$$ \left (\begin{array}{cc} 2 & 0 \\ 1 & 2 \end{array}\right )$$
In finite dimensions, the continuous and residual spectra are empty, and it is easy to see that the only eigenvalue is $2$. Thus the spectrum of this operator is the single point $\{2\}$, yet the matrix is clearly not equal to $2I$.
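The counterexample can be checked numerically as well; this sketch just confirms that the matrix above has $2$ as its only eigenvalue while being neither $2I$ nor symmetric:

```python
import numpy as np

# The proposed counterexample: a Jordan block with eigenvalue 2.
A = np.array([[2.0, 0.0],
              [1.0, 2.0]])

print(np.linalg.eigvals(A))            # [2. 2.]: the spectrum is the single point 2
print(np.allclose(A, 2 * np.eye(2)))   # False: A is not 2I
print(np.allclose(A, A.T))             # False: A is not self-adjoint
```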