I'm familiar with the spectral theorem and its consequences, but I'm having trouble proving the converse: Let $T$ be a linear transformation on a finite-dimensional inner product space $V$, and let $\lambda_1,\ldots,\lambda_k$ be distinct scalars. Also, $P_1,\ldots,P_k$ are linear transformations such that:
$T=\lambda_1P_1+\ldots+\lambda_kP_k$
$I=P_1+\ldots+P_k$
$P_iP_j=0$ for $i \neq j$; $P_i^2=P_i$; $P_i^*=P_i$
I need to prove that: 1) $\lambda_1,\ldots,\lambda_k$ are all the eigenvalues of $T$; 2) $P_1,\ldots,P_k$ are the orthogonal projections onto the eigenspaces of the eigenvalues $\lambda_1,\ldots,\lambda_k$; 3) $T$ is normal when $F=\mathbb{C}$ and symmetric when $F=\mathbb{R}$.
1) $P_i^*=P_i$, so $P_i$ is diagonalizable; thus $P_i$ has an eigenvalue $\alpha_i$. I don't know how to conclude from here that $\lambda_1,\ldots,\lambda_k$ are all the eigenvalues of $T$. 2) ? 3) ?
I really need help. Thanks.
I don't see how to rule out that $P_i = 0$ for some $i$, so I will assume that none of them is zero.
The hypotheses in the third line show that the transformations $P_i$ are orthogonal projections with pairwise orthogonal images. Indeed, from $P_i^2 = P_i$, any $v \in V$ can be written as $v = P_iv + (v - P_iv)$, with $P_iv \in \operatorname{Im}(P_i)$ and $v - P_iv \in \ker(P_i)$; since $P_i^* = P_i$, these two pieces are orthogonal, so $P_i$ is the orthogonal projection onto its image. Moreover, if $v = P_iv$ and $u = P_ju$ with $i \neq j$, then using $P_i^* = P_i$ and $P_iP_j = 0$, $$\langle v, u\rangle = \langle P_iv, P_ju\rangle = \langle v, P_i^*P_ju \rangle = \langle v, P_iP_ju \rangle = \langle v, 0\rangle = 0. $$
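As a numerical sanity check (not part of the proof), here is a concrete pair of operators of my own choosing that satisfies all three hypotheses: the orthogonal projections of $\mathbb{R}^2$ onto $\operatorname{span}\{(1,1)\}$ and $\operatorname{span}\{(1,-1)\}$. All entries are dyadic rationals, so the floating-point checks are exact.

```python
# Illustrative example (my own choice, not from the question): projections of
# R^2 onto span{(1,1)} and span{(1,-1)} satisfy the third-line hypotheses.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P1 = [[0.5, 0.5], [0.5, 0.5]]    # orthogonal projection onto span{(1, 1)}
P2 = [[0.5, -0.5], [-0.5, 0.5]]  # orthogonal projection onto span{(1, -1)}

assert matmul(P1, P1) == P1                        # P1^2 = P1 (idempotent)
assert matmul(P2, P2) == P2                        # P2^2 = P2
assert matmul(P1, P2) == [[0.0, 0.0], [0.0, 0.0]]  # P1 P2 = 0
# P_i^* = P_i: both matrices are symmetric
assert all(P1[i][j] == P1[j][i] for i in range(2) for j in range(2))
assert all(P2[i][j] == P2[j][i] for i in range(2) for j in range(2))

# pairwise orthogonal images: <P1 u, P2 v> = 0 for arbitrary u, v
u, v = [3.0, 7.0], [-2.0, 5.0]
p1u = [sum(P1[i][k] * u[k] for k in range(2)) for i in range(2)]
p2v = [sum(P2[i][k] * v[k] for k in range(2)) for i in range(2)]
assert sum(a * b for a, b in zip(p1u, p2v)) == 0.0
```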
The hypothesis in the second line shows that $V$ is in fact the direct sum of these images: for any vector $v \in V$, $v = Iv = P_1v + \ldots + P_kv$, and the sum is direct because the images are pairwise orthogonal.
So for $v \in \operatorname{Im}(P_i)$, $v \neq 0$, we have that $v = P_iv$ and $Tv = (\lambda_1P_1 + \ldots + \lambda_kP_k)P_iv = \lambda_1P_1P_iv + \ldots + \lambda_kP_kP_iv = \lambda_i v$, since $P_jP_iv = 0$ for $j \neq i$ and $P_i^2v = P_iv = v$.
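This eigenvalue computation can be checked numerically on a small example of my own (the matrices and scalars below are illustrative choices, not from the question): take the projections of $\mathbb{R}^2$ onto $\operatorname{span}\{(1,1)\}$ and $\operatorname{span}\{(1,-1)\}$ with $\lambda_1 = 2$, $\lambda_2 = 4$, and verify that $T = \lambda_1 P_1 + \lambda_2 P_2$ acts as multiplication by $\lambda_i$ on $\operatorname{Im}(P_i)$.

```python
# Illustrative example: T = 2*P1 + 4*P2 in R^2, with P1, P2 the orthogonal
# projections onto span{(1,1)} and span{(1,-1)}.  All values are exact floats.

P1 = [[0.5, 0.5], [0.5, 0.5]]
P2 = [[0.5, -0.5], [-0.5, 0.5]]
lam = [2.0, 4.0]                 # distinct scalars lambda_1, lambda_2

# T = lambda_1 P1 + lambda_2 P2
T = [[lam[0] * P1[i][j] + lam[1] * P2[i][j] for j in range(2)]
     for i in range(2)]

def apply(M, x):
    """Apply a 2x2 matrix to a vector."""
    return [sum(M[i][k] * x[k] for k in range(2)) for i in range(2)]

v = [3.0, -1.0]                  # arbitrary test vector
for lam_i, P in zip(lam, [P1, P2]):
    w = apply(P, v)              # w = P_i v lies in Im(P_i)
    assert apply(T, w) == [lam_i * c for c in w]   # T w = lambda_i w

# second-line hypothesis I = P1 + P2: v decomposes as P1 v + P2 v
assert [a + b for a, b in zip(apply(P1, v), apply(P2, v))] == v
```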
This shows each $\lambda_i$ is an eigenvalue of $T$. To see that there are no others and that $\operatorname{Im}(P_i)$ is the full eigenspace, suppose $Tv = \mu v$ with $v \neq 0$. Since $P_jT = \lambda_jP_j$, applying $P_j$ gives $\mu P_jv = \lambda_j P_jv$, so $P_jv = 0$ whenever $\mu \neq \lambda_j$. As $v = P_1v + \ldots + P_kv \neq 0$, we must have $\mu = \lambda_i$ for some $i$, and then $v = P_iv \in \operatorname{Im}(P_i)$. This answers the first and the second questions.
For the third, note that $T^* = \overline{\lambda_1}P_1 + \ldots + \overline{\lambda_k}P_k$ (because $P_i^* = P_i$), and using $P_iP_j = \delta_{ij}P_i$ you get $$TT^* = |\lambda_1|^2P_1 + \ldots + |\lambda_k|^2P_k = T^*T,$$ so $T$ is normal. (Beware: an arbitrary linear combination of normal operators need not be normal; the pairwise orthogonality of the $P_i$ is what makes this work.) When $F = \mathbb{R}$ the $\lambda_i$ are real, so $T^* = T$ and $T$ is symmetric.
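The normality identity can also be confirmed numerically on a toy example of my own devising (again, the specific matrices and scalars are illustrative assumptions): with real $\lambda_i$, the operator $T = 2P_1 + 4P_2$ built from two orthogonal projections in $\mathbb{R}^2$ is symmetric, and $TT^*$ equals $T^*T$ and $|\lambda_1|^2P_1 + |\lambda_2|^2P_2$.

```python
# Illustrative check of normality/symmetry for T = 2*P1 + 4*P2 in R^2.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    """Adjoint over R is just the transpose."""
    return [list(row) for row in zip(*A)]

P1 = [[0.5, 0.5], [0.5, 0.5]]    # projection onto span{(1, 1)}
P2 = [[0.5, -0.5], [-0.5, 0.5]]  # projection onto span{(1, -1)}
T = [[2 * P1[i][j] + 4 * P2[i][j] for j in range(2)] for i in range(2)]

assert transpose(T) == T                                   # T* = T: symmetric
assert matmul(T, transpose(T)) == matmul(transpose(T), T)  # T T* = T* T: normal

# T T* = |lambda_1|^2 P1 + |lambda_2|^2 P2 with lambda_1 = 2, lambda_2 = 4
S = [[4 * P1[i][j] + 16 * P2[i][j] for j in range(2)] for i in range(2)]
assert matmul(T, transpose(T)) == S
```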