general spectral theory for compact self-adjoint operators


If $T$ is a compact self-adjoint operator on a Hilbert space $H$ and $\{\lambda_1, \lambda_2, \dots\}$ is the set of nonzero eigenvalues of $T$, then $T=\sum_{i=1}^{\infty}\lambda_i P_i$, where $P_i$ is the orthogonal projection from $H$ onto $\ker(T-\lambda_i)$.
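
As a concrete illustration of this decomposition (an example added for clarity, not part of the original question), take $H = \ell^2(\mathbb{N})$ with standard orthonormal basis $(e_n)$ and the diagonal operator
$$T e_n = \frac{1}{n}\, e_n .$$
This $T$ is compact and self-adjoint, its nonzero eigenvalues are $\lambda_n = 1/n$, each $P_n$ is the rank-one orthogonal projection onto $\operatorname{span}\{e_n\}$, and the series $\sum_{n=1}^{\infty} \frac{1}{n} P_n$ converges to $T$ in operator norm.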

There is a general spectral theorem (see Conway's book): if $T$ is a normal operator on $H$, then there exists a spectral measure $\mu$ on the Borel subsets of $\sigma(T)$ such that $T=\int_{\sigma(T)} z \,\mathrm{d}\mu(z)$.
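
Continuing the illustration above (again an added example, not from the original question): for the diagonal operator $T e_n = \frac{1}{n} e_n$ one has $\sigma(T) = \{1/n : n \geq 1\} \cup \{0\}$, and the spectral measure is
$$\mu(B) = \sum_{n \,:\, 1/n \in B} P_n, \qquad B \subseteq \sigma(T) \text{ Borel},$$
with the sum converging in the strong operator topology, so that $\int_{\sigma(T)} z \,\mathrm{d}\mu(z) = \sum_{n=1}^{\infty} \frac{1}{n} P_n = T$, and in particular $\mu(\{1/n\}) = P_n$, which is the pattern the question below asks about in general.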

If $T$ is compact and self-adjoint, how does one use the general spectral theorem to conclude that $\mu(\{\lambda_i\})=P_i$, where $\lambda_i$ is a nonzero eigenvalue of $T$?