Spectral theorem of compact operators in Hilbert space


I am reading the following theorem in my lecture notes (an English translation of a German text), but I don't understand exactly what the theorem and its proof mean.

Theorem.

Let $H$ be a Hilbert space over $\mathbb K$ and $T \in K(H)$, where $K(H)$ denotes the set of compact operators from $H$ to $H$. Let $T$ be self-adjoint or normal depending on whether the underlying field $\mathbb K$ is $\mathbb R$ or $\mathbb C$. Then there exists an orthonormal set $\{e_i \mid i\in I\}$, where $I$ is either $\mathbb N$ or $\{1,2,\dotsc,k\}$, and a sequence $(\lambda_i)_{i\in I}$ in $\mathbb K$, converging to $0$ in the case $I = \mathbb N$, such that

$$\operatorname{span}\{e_i : i \in I\}^{\perp} = \ker(T)$$

and also $\forall x \in H$ : $Tx= \sum_{i\in I} \lambda_i \langle x, e_i \rangle e_i$ with unconditional convergence.

I am sorry, but I don't fully understand the theorem, possibly because the translation is wrong. Can someone point me to this theorem in a textbook or some other reference?

Thanks.


This is about the spectral decomposition of compact operators.

According to the theorem, a compact operator that is self-adjoint (respectively normal) has at most countably many eigenspaces with nonzero eigenvalue, each of finite dimension, and eigenspaces belonging to distinct eigenvalues are mutually orthogonal (a consequence of self-adjointness / normality). Denote the corresponding orthonormal eigenvectors by $e_1, e_2, e_3, \dots$ and the eigenvalues by $\lambda_1, \lambda_2, \dots$ Moreover, if there are infinitely many eigenvalues, they must tend to $0$.
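To make this concrete, here is a small finite-dimensional sanity check (my own sketch using numpy, not part of the lecture notes): a real symmetric matrix plays the role of the compact self-adjoint operator $T$, `np.linalg.eigh` produces the orthonormal eigenvectors $e_i$ and eigenvalues $\lambda_i$, and one can verify numerically that $Tx = \sum_i \lambda_i \langle x, e_i\rangle e_i$.

```python
import numpy as np

# Finite-dimensional analogue: a real symmetric matrix T is a
# (trivially compact) self-adjoint operator on R^5.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
T = (A + A.T) / 2

# eigh returns real eigenvalues lam[i] and an orthonormal basis of
# eigenvectors as the columns of E.
lam, E = np.linalg.eigh(T)

x = rng.standard_normal(5)

# Reconstruct Tx via the spectral decomposition
# Tx = sum_i lam_i * <x, e_i> * e_i.
Tx_spectral = sum(lam[i] * np.dot(x, E[:, i]) * E[:, i] for i in range(5))

print(np.allclose(T @ x, Tx_spectral))  # True
```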

Additionally, the operator vanishes on the orthogonal complement of the span of these eigenvectors; that complement is exactly $\ker(T)$, which is what the first displayed identity says.
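For an infinite-dimensional illustration (an example of my own, not from the notes), take $H = \ell^2$ and define

$$T(x_1, x_2, x_3, x_4, x_5, \dots) = \bigl(x_1,\, 0,\, \tfrac{1}{3}x_3,\, 0,\, \tfrac{1}{5}x_5,\, 0,\, \dots\bigr).$$

This $T$ is self-adjoint and compact (it is the norm limit of its finite-rank truncations, since its diagonal entries tend to $0$). Its eigenvectors with nonzero eigenvalue are the odd-indexed standard unit vectors, with eigenvalues $1, \tfrac{1}{3}, \tfrac{1}{5}, \dots \to 0$, while $\ker(T)$ is the closed span of the even-indexed ones, which is precisely the orthogonal complement of the span of those eigenvectors, exactly as the theorem asserts.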