Let $T \in \mathcal{L}(V)$ be an operator on a finite-dimensional complex vector space $V$. Result 8.33 proves that if $T$ is invertible, it must have a square root.
It shows that $T |_{G(\lambda_i, T)} = \lambda_i (I + \frac{N_i}{\lambda_i})$ has a square root, where $N_i \in \mathcal{L}(G(\lambda_i, T))$ is nilpotent, $G(\lambda_i, T)$ is the generalized eigenspace corresponding to $\lambda_i$, and $\{ \lambda_i \}$ is the set of distinct eigenvalues of $T$.
Finally, the proof concludes by asserting that the operator $R$ defined by $Rv = R_1 u_1 + ... + R_m u_m$, where $v = u_1 + ... + u_m$ with $u_i \in G(\lambda_i, T)$ and $R_i = \sqrt{T |_{G(\lambda_i, T)}} = \sqrt{\lambda_i (I + \frac{N_i}{\lambda_i})}$, is indeed a square root of $T$.
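(For context, each restriction has a square root because $N_i/\lambda_i$ is nilpotent, so the binomial series for $\sqrt{1+x}$ terminates after finitely many terms; a sketch, writing $A = N_i/\lambda_i$ with $A^k = 0$ for some $k$, and $\sqrt{\lambda_i}$ any complex square root, which exists since $\lambda_i \neq 0$ for invertible $T$:)

```latex
\sqrt{\lambda_i\left(I + \tfrac{N_i}{\lambda_i}\right)}
  = \sqrt{\lambda_i}\left(I + \tfrac{1}{2}A - \tfrac{1}{8}A^2 + \dots\right),
  \qquad A = \tfrac{N_i}{\lambda_i},\quad A^k = 0,
```

so the series on the right is in fact a finite sum, and squaring it recovers $\lambda_i(I + A)$.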
I'm having trouble verifying this.
By applying the expansion of $R$ again to the right-hand side I get
$R_1 (R_1 u_1 + ... + R_m u_m) + ... + R_m(R_1 u_1 + ... + R_m u_m) = R_1^2u_1 + ... + R_m^2u_m + R_1(R_2u_2 + ... + R_m u_m) + ... + R_m(R_1 u_1 + ... + R_{m-1} u_{m-1})$
$R^2_i u_i = \lambda_i (I + \frac{N_i}{\lambda_i}) u_i = T|_{G(\lambda_i, T)} u_i$ is what I need, but the remaining cross terms are unwanted.
So I need to show that $R_1(R_2u_2 + ... + R_m u_m) + ... + R_m(R_1 u_1 + ... + R_{m-1} u_{m-1}) = 0$.
Q1) $R_i$ is defined on elements of $G(\lambda_i, T)$, but in the equation above $R_1$, for instance, needs to be applied to an element of $G(\lambda_2, T) + ... + G(\lambda_m, T)$. It should probably map it to $0$, but I'm confused about how a linear map restricted to $G(\lambda_i, T)$ is supposed to act on elements outside this subspace.
Q2) If $R_i u_j \neq 0$ for $i\neq j$, then what is it equal to?
Edit: I realized I applied $R$ incorrectly. Applying it correctly makes the verification simple.
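Spelled out, the correct computation is the following; the key point is that each $G(\lambda_i, T)$ is invariant under $R_i$, so $R_i u_i \in G(\lambda_i, T)$, and hence $R_1 u_1 + \dots + R_m u_m$ is already the decomposition of $Rv$ into generalized eigenspace components, to which $R$ applies $R_i$ componentwise:

```latex
\begin{align*}
R^2 v &= R(R_1 u_1 + \dots + R_m u_m) \\
      &= R_1(R_1 u_1) + \dots + R_m(R_m u_m)
         && \text{(since } R_i u_i \in G(\lambda_i, T)\text{)} \\
      &= R_1^2 u_1 + \dots + R_m^2 u_m \\
      &= T|_{G(\lambda_1, T)} u_1 + \dots + T|_{G(\lambda_m, T)} u_m = Tv.
\end{align*}
```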
Are you familiar with the Jordan normal form of a matrix? It states that any complex (square) matrix $T$ can be written as $ T = S \Lambda S^{-1} $, where $\Lambda$ is block diagonal and the columns of $S$ are the (generalized) eigenvectors of $T$. Hence, after changing the basis to $S$, the matrix of $T$ is block diagonal. This means that the vector space $V$ can be written as $V = \oplus_{i=1}^m V_i $, where the $V_i$'s are invariant under the action of $T$.

Thus, writing $u = u_1 + \ldots + u_m$ with $u_i \in V_i$, for any $u\in V$ $$ Tu = T_1 u_1 + T_2 u_2 + \ldots + T_m u_m $$ holds, where $T_i$ denotes the restriction of $T$ to $V_i$, extended by zero to the other summands; with this convention, $T_i T_j = T_jT_i = 0$ for $i\neq j$. Since each $R_j$ is defined in the same way, this also implies that $R_i R_j = R_j R_i = 0$ for $i\neq j$.
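To make the last point concrete, here is a small numerical sketch (all matrices and eigenvalues below are made up for illustration): $T$ is block diagonal with a $2\times 2$ Jordan block for $\lambda_1 = 4$ and a $1\times 1$ block for $\lambda_2 = 9$. Each $R_i$ is the block square root extended by zero outside its own block, so the cross products $R_i R_j$ for $i \neq j$ vanish and $R = R_1 + R_2$ squares to $T$:

```python
# Illustrative example (numbers chosen for this sketch): T is block diagonal in
# the basis of generalized eigenvectors, with a 2x2 Jordan block for lam1 = 4
# and a 1x1 block for lam2 = 9.
n = 3

def matmul(A, B):
    """Multiply two n x n matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

T = [[4.0, 1.0, 0.0],
     [0.0, 4.0, 0.0],
     [0.0, 0.0, 9.0]]

# Block square roots, each extended by zero outside its own block:
# sqrt of [[4, 1], [0, 4]] is sqrt(4) * (I + N/8) = [[2, 0.25], [0, 2]], since
# the binomial series terminates (N @ N = 0); sqrt of [[9]] is [[3]].
R1 = [[2.0, 0.25, 0.0],
      [0.0, 2.0,  0.0],
      [0.0, 0.0,  0.0]]
R2 = [[0.0, 0.0, 0.0],
      [0.0, 0.0, 0.0],
      [0.0, 0.0, 3.0]]

zero = [[0.0] * n for _ in range(n)]
assert matmul(R1, R2) == zero and matmul(R2, R1) == zero  # cross terms vanish

# R = R1 + R2, and R^2 = R1^2 + R2^2 = T because the cross terms are zero.
R = [[R1[i][j] + R2[i][j] for j in range(n)] for i in range(n)]
Rsq = matmul(R, R)
assert all(abs(Rsq[i][j] - T[i][j]) < 1e-12 for i in range(n) for j in range(n))
```

The same bookkeeping is exactly what makes the cross terms in the question's expansion disappear once each $R_i$ is read as "zero off its own block".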