I'm working on the proof of the Dunford decomposition theorem:
Every matrix $A\in M_n(K)$ whose characteristic polynomial splits over $K$ can be written as $A=D+N$, where $D$ is diagonalizable, $N$ is nilpotent, and $DN=ND$.
The proof:
Let $A \in M_n(K)$ be a linear operator with spectrum $\sigma(A)=\{\lambda_1,\dots,\lambda_r\}$. The characteristic polynomial of $A$ can be written $$X_A(t)=\prod\limits_{i=1}^{r}(t-\lambda_i)^{m_i}, \quad \text{where } m_i \text{ is the algebraic multiplicity of } \lambda_i.$$ Then, by the primary decomposition theorem (together with Cayley–Hamilton, which gives $X_A(A)=0$): $$K^n=\ker(X_A(A))=N_1 \oplus \cdots \oplus N_r,$$ where $N_i=N_{\lambda_i}(A)=\ker\big((A-\lambda_iI_n)^{m_i}\big)$.
If we call $B_i$ a basis of $N_i$ for all $i \in \{1,\dots,r\}$, then the concatenation $B=B_1 \cup \cdots \cup B_r$ is a basis of $K^n$. Here is the thing that I don't understand: why does the matrix of $A$ in this basis have the form
$$\begin{pmatrix} A_1 & 0 & \dots & 0 \\ 0 & A_2 & \dots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \dots & A_r \end{pmatrix}$$
where each $A_i$, $i\in \{1,\dots,r\}$, is a block of size $m_i\times m_i$?
It has this form because each $N_i$ is stable under $A$: since $A$ commutes with $(A-\lambda_iI_n)^{m_i}$, for any $v \in N_i$ we get $(A-\lambda_iI_n)^{m_i}Av = A(A-\lambda_iI_n)^{m_i}v = 0$, so $Av \in N_i$. The basis vectors of $B_i$ are therefore sent to linear combinations of vectors of $B_i$, and their components on $B_j$, $j\neq i$, are $0$.
More generally, if $E$ is a vector space, $f$ an endomorphism of $E$, and $F,W$ two $f$-stable subspaces with $E=F\oplus W$, then the matrix of $f$ in the basis $B_1\cup B_2$ (where $B_1$ is a basis of $F$ and $B_2$ a basis of $W$) is $\begin{pmatrix} A_1 & 0 \\ 0 & A_2 \end{pmatrix}$, where $A_1$ is the matrix in the basis $B_1$ of the restriction-corestriction of $f$ to $F$, and $A_2$ similarly with $W$ and $B_2$.
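If it helps to see this concretely, here is a small numerical sketch with SymPy (the matrices $B$, $P$, $A$ below are my own illustrative choices, not from the proof): we hide a block structure by conjugation, recover the generalized eigenspaces $N_i=\ker\big((A-\lambda_i I)^{m_i}\big)$, and check that the matrix of $A$ in the concatenated basis is block diagonal.

```python
import sympy as sp

# Hypothetical example: start from a matrix B that is visibly block diagonal,
# with eigenvalue 2 (algebraic multiplicity 2) and eigenvalue 5 (multiplicity 1),
# then conjugate by an invertible P so the block structure is no longer apparent.
B = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 5]])
P = sp.Matrix([[1, 1, 0],
               [0, 1, 1],
               [1, 0, 1]])
A = P * B * P.inv()

I = sp.eye(3)
# Generalized eigenspaces N_i = ker((A - lambda_i I)^{m_i})
N1 = ((A - 2 * I) ** 2).nullspace()  # basis of ker((A-2I)^2): 2 vectors
N2 = (A - 5 * I).nullspace()         # basis of ker(A-5I): 1 vector

# Change-of-basis matrix whose columns are the bases B_1 and B_2 concatenated
Q = sp.Matrix.hstack(*(N1 + N2))
M = sp.simplify(Q.inv() * A * Q)

# M is block diagonal: a 2x2 block for lambda=2 and a 1x1 block (= 5) for lambda=5;
# the off-diagonal blocks M[0:2, 2] and M[2, 0:2] are zero.
print(M)
```

The computation is exact (rational arithmetic), so the off-diagonal blocks come out as genuine zeros rather than numerical noise, which is exactly the stability argument above in coordinates.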