I'm trying to solve the following exercise.
Let $A$ be a $\nu\times\nu$ matrix with entries in a field $F$ and let $\chi_A(x)=(-1)^\nu(\lambda_1-x)^{\sigma_1}\dots(\lambda_k-x)^{\sigma_k}$ be the characteristic polynomial of $A$, where the $\lambda_i$ are pairwise distinct. Suppose there exist unique $\nu\times\nu$ matrices $E_1,E_2,\dots,E_k$ such that:
- $A=\lambda_1E_1+\dots+\lambda_kE_k$.
- $E_i^2=E_i$ for $1\leq i\leq k$.
- $E_iE_j=0$ if $1\leq i,j\leq k$, $i\neq j$.
- $E_1+E_2+\dots+E_k=I_\nu$.
I want to prove that $\operatorname{rank}(E_i)=\sigma_i$ for $1\leq i\leq k$.
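As a quick sanity check (not part of the exercise), for a concrete matrix the $E_i$ can be written down explicitly and the four properties verified directly. The sketch below uses the standard Lagrange-interpolation formula $E_i=\prod_{j\neq i}(A-\lambda_j I)/(\lambda_i-\lambda_j)$ for the spectral projectors; the particular $2\times 2$ matrix is just an illustrative choice.

```python
from fractions import Fraction

def matmul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def lincomb(c, X, d, Y):
    """Return the entrywise combination c*X + d*Y."""
    return [[c * x + d * y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

# A has eigenvalues l1 = 3 and l2 = 2, each with multiplicity sigma = 1.
A  = [[Fraction(3), Fraction(1)],
      [Fraction(0), Fraction(2)]]
I2 = [[Fraction(1), Fraction(0)],
      [Fraction(0), Fraction(1)]]
l1, l2 = Fraction(3), Fraction(2)

# Lagrange-interpolation projectors:
#   E1 = (A - l2*I)/(l1 - l2),  E2 = (A - l1*I)/(l2 - l1)
E1 = lincomb(1 / (l1 - l2), A, -l2 / (l1 - l2), I2)
E2 = lincomb(1 / (l2 - l1), A, -l1 / (l2 - l1), I2)

Z = [[Fraction(0)] * 2 for _ in range(2)]
assert lincomb(l1, E1, l2, E2) == A                   # A = l1*E1 + l2*E2
assert matmul(E1, E1) == E1 and matmul(E2, E2) == E2  # E_i^2 = E_i
assert matmul(E1, E2) == Z and matmul(E2, E1) == Z    # E_i E_j = 0 for i != j
assert lincomb(1, E1, 1, E2) == I2                    # E1 + E2 = I
```

Exact rational arithmetic (`Fraction`) is used so the equality checks are not spoiled by floating-point error.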
I have started the solution like this. We know from a theorem that the matrix $A$ is diagonalizable. Since $A$ is diagonalizable, each eigenspace has dimension equal to the corresponding algebraic multiplicity, i.e. $\dim V(\lambda_i)=\sigma_i$ for $1\leq i\leq k$. Next, we observe that
$$A E_i=\lambda_iE_i$$
for every $1\leq i\leq k$, and if we denote by $E_i^{(\mu)}$, $1\leq\mu\leq\nu$, the columns of $E_i$, it is easy to show that each non-zero column $E_i^{(\mu)}$ is an eigenvector of $A$ with eigenvalue $\lambda_i$. So we just need to prove that $E_i$ has exactly $\sigma_i$ linearly independent columns, and the proof will be finished.
At the moment I can't show that. Any help?
The equation $AE_i = \lambda_i E_i$ shows that any non-zero vector in the column space of $E_i$ is an eigenvector of $A$ associated with the eigenvalue $\lambda_i$; in other words, the column space of $E_i$ is contained in the eigenspace $V(\lambda_i)$. Hence $\operatorname{rank}(E_i) \leq \dim V(\lambda_i) = \sigma_i$ for all $1 \leq i \leq k$. Assume that $\operatorname{rank}(E_j) < \sigma_j$ for some $1 \leq j \leq k$. Then, using subadditivity of rank,
$$ \nu = \operatorname{rank}(I_\nu) = \operatorname{rank}(E_1 + \dots + E_k) \leq \sum_{i=1}^k \operatorname{rank} (E_i) < \sum_{i=1}^k \sigma_i = \nu $$
and we obtain a contradiction.
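For what it's worth, the conclusion $\operatorname{rank}(E_i)=\sigma_i$ can also be checked numerically. In the sketch below the $3\times 3$ matrix, its projectors (obtained from $E_2=(A-2I)/(3-2)$ and $E_1=I-E_2$), and the exact-arithmetic rank routine are all illustrative choices, not part of the answer above.

```python
from fractions import Fraction

def rank(X):
    """Rank of a matrix, via Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in X]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue                      # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# A is diagonalizable with eigenvalues 2 (sigma1 = 2) and 3 (sigma2 = 1).
A = [[2, 0, 1],
     [0, 2, 0],
     [0, 0, 3]]

# Spectral projectors: E2 = (A - 2I)/(3 - 2) and E1 = I - E2.
E2 = [[0, 0, 1],
      [0, 0, 0],
      [0, 0, 1]]
E1 = [[1, 0, -1],
      [0, 1, 0],
      [0, 0, 0]]

assert rank(E1) == 2   # = sigma1
assert rank(E2) == 1   # = sigma2
# rank(E1 + E2) = rank(I) = nu, matching the inequality chain above
S = [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(E1, E2)]
assert rank(S) == 3
```

Note that this example also exercises the $\sigma_i>1$ case, where $\operatorname{rank}(E_i)$ counts the $\sigma_i$ independent columns rather than all non-zero ones.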