Let $\mathbb F \in \{\mathbb C, \mathbb R\}$. Define the operator $T_r$ on $\mathbb F^2$ by $T_r(x)=\begin{bmatrix} 1 & 1 \\ r^2 & 1\end{bmatrix}x$, where $r\in \mathbb R$. I want to prove that when $0< r < 1$, then $T_r$ is diagonalizable.
I started by considering the following.
Suppose $\lambda \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} =\begin{bmatrix} 1 & 1 \\r^2 & 1\end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}$. Then $\begin{bmatrix} x_1 +x_2 \\ x_1r^2 +x_2 \end{bmatrix} =\begin{bmatrix} \lambda x_1 \\ \lambda x_2 \end{bmatrix}$, so that $\lambda x_1 =x_1+x_2$ and $\lambda x_2 =x_1r^2+x_2$. This forces $x_2=(\lambda-1)x_1$.
But I am not sure if continuing in this way is efficient. Would taking the characteristic polynomial be quicker? If so, is there a trick for knowing why?
Recall that if the characteristic polynomial splits into distinct linear factors, then $T$ is diagonalizable. The characteristic polynomial is $$P_T(X)=(X-1)^2-r^2= X^2-2X+1-r^2.$$ By the quadratic formula the roots are $$X_{1,2}=\frac{2\pm\sqrt{4-4(1-r^2)}}{2}=1\pm r,$$ i.e. $X_1=1+r$ and $X_2=1-r$.
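As a quick numerical sanity check (not part of the proof), we can confirm with NumPy that the eigenvalues are indeed $1\pm r$ for a sample value of $r$ in $(0,1)$:

```python
import numpy as np

r = 0.5  # any value with 0 < r < 1
A = np.array([[1.0, 1.0],
              [r**2, 1.0]])

# The eigenvalues of A should be 1 - r and 1 + r.
eigvals = np.sort(np.linalg.eigvals(A).real)
assert np.allclose(eigvals, [1 - r, 1 + r])
```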
Since $r\neq 0$, there are two distinct eigenvalues and thus two linearly independent eigenvectors. This shows that $T$ is diagonalizable. From here it's also easy to calculate the corresponding eigenvectors (although we don't need them to show that $T$ is diagonalizable).
Indeed, to find an eigenvector for $1+r$ we have to find the nullspace of $$T-(1+r)Id=\begin{pmatrix} -r&1\\r^2&-r \end{pmatrix}\rightarrow \begin{pmatrix} r&-1\\ 0&0 \end{pmatrix}.$$ A solution is given by $v_1=(1,r)$. Similarly you can find the other eigenvector. Notice that I didn't have to know the row operations to row reduce the matrix $T-(1+r)Id$. Indeed, the rows had to be linearly dependent, because $1+r$ is an eigenvalue and so the matrix is singular.
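To double-check the eigenvectors, here is a short NumPy verification (the second eigenvector $(1,-r)$ for $1-r$ is found the same way and is included as an illustration):

```python
import numpy as np

r = 0.5  # any value with 0 < r < 1
A = np.array([[1.0, 1.0],
              [r**2, 1.0]])

v1 = np.array([1.0, r])   # eigenvector for eigenvalue 1 + r
v2 = np.array([1.0, -r])  # eigenvector for eigenvalue 1 - r

# A v = lambda v for each pair
assert np.allclose(A @ v1, (1 + r) * v1)
assert np.allclose(A @ v2, (1 - r) * v2)
```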