Given an $n\times n$ matrix $A$ over an algebraically closed field, let $\lambda_1,\dots,\lambda_k$ be its distinct eigenvalues, and let $V_{\lambda_i}$ be the generalized eigenspace of $\lambda_i$. The question is to prove that $\bigoplus_{i=1}^k V_{\lambda_i} = V$. The generalized eigenspace is defined as follows:
$V_{\lambda_i}=\{x:(A-\lambda_i I)^{m(\lambda_i)}x=0\}$ where $m(\lambda_i)$ is the algebraic multiplicity of $\lambda_i$.
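The definition can be checked on a small example (a sketch using sympy; the matrix below is my own illustration, not from the question): for a defective matrix, the ordinary eigenspace can be smaller than the algebraic multiplicity, while the generalized eigenspace recovers the full dimension.

```python
from sympy import Matrix, eye

# A has the single eigenvalue 2 with algebraic multiplicity m = 2,
# but only a one-dimensional ordinary eigenspace (a Jordan block).
A = Matrix([[2, 1],
            [0, 2]])
lam, m = 2, 2  # eigenvalue and its algebraic multiplicity

eigenspace = (A - lam * eye(2)).nullspace()           # ker(A - 2I)
gen_eigenspace = ((A - lam * eye(2))**m).nullspace()  # ker((A - 2I)^2)

print(len(eigenspace))      # 1: only one independent eigenvector
print(len(gen_eigenspace))  # 2: the generalized eigenspace is all of V
```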
A proof from the textbook goes as follows.
Let $d_i = \dim V_{\lambda_i}$. Suppose $\bigoplus_{i=1}^k V_{\lambda_i} \neq V$; then $\sum_{i=1}^k d_i < n$. Let $S_i$ be a basis of $V_{\lambda_i}$ consisting of $d_i$ linearly independent vectors, and extend $(S_1,\dots,S_k)$ to a basis of $V$ by adding $n-\sum_{i=1}^k d_i$ linearly independent vectors $S'$; then $S=(S_1,\dots,S_k,S')$ is a basis of $V$.
Note that generalized eigenspaces are invariant subspaces, i.e. $AV_{\lambda_i}\subseteq V_{\lambda_i}$; thus, viewing each $S_i$ as a matrix whose columns are the basis vectors, every column of $AS_i$ is still in $V_{\lambda_i}$ and hence a linear combination of the columns of $S_i$, for $1\le i\le k$. Thus the following holds (NOTE the $*$ blocks could be non-zero):
$$AS = S\begin{pmatrix} A_1 & & & * \\ & \ddots & & \vdots \\ & & A_k & * \\ & & & B \end{pmatrix},$$
where $AS_i=SA_i$ for $i=1,\dots,k$, with $A_i$ the $i$-th block column of the matrix on the right.
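The invariance step can be made concrete (a sketch with a matrix of my own choosing, not the textbook's): when the bases of all the generalized eigenspaces are stacked into $S$, the change of basis $S^{-1}AS$ comes out block diagonal, precisely because each $V_{\lambda_i}$ is $A$-invariant.

```python
from sympy import Matrix, eye, zeros

# Example: eigenvalue 3 with algebraic multiplicity 2, eigenvalue 1 with multiplicity 1.
A = Matrix([[3, 1, 1],
            [0, 3, 0],
            [0, 0, 1]])

blocks = []   # the dimensions d_i, in the order the columns enter S
S_cols = []
for lam, mult in A.eigenvals().items():
    basis = ((A - lam * eye(3))**mult).nullspace()  # basis of V_lambda
    blocks.append(len(basis))
    S_cols += basis

S = Matrix.hstack(*S_cols)
M = S.inv() * A * S  # block diagonal: the off-diagonal blocks vanish

print(blocks)  # the d_i, summing to n = 3
print(M)
```

Here the decomposition is complete, so even the $*$ blocks vanish; in the proof's setting, where the bases span only a proper subspace, the last block column can be non-zero.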
This is where I get lost. The remaining part of the proof is the following:
I get the idea: the proof wants to find a vector $x$ such that $Sx$ is a linear combination of only the vectors in $S'$, then show that $Sx$ is an eigenvector for some eigenvalue $\lambda_i$, and hence reach a contradiction, because $Sx$ would be a vector in $V_{\lambda_i}$ that cannot be expressed as a linear combination of $S_i$. However, I don't understand why this argument works, especially the highlighted parts.
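For what it's worth, the conclusion being proved is easy to check numerically (a sketch; the matrix is an arbitrary example of mine, not from the textbook): the dimensions of the generalized eigenspaces always add up to $n$, so their direct sum is all of $V$.

```python
from sympy import Matrix, eye

A = Matrix([[0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 5]])  # eigenvalues 0 (multiplicity 3) and 5

n = A.rows
dims = [len(((A - lam * eye(n))**mult).nullspace())
        for lam, mult in A.eigenvals().items()]
print(sum(dims))  # equals n = 4, as the theorem asserts
```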

