Are there theorems that tell you (a) when all eigenvalues lie strictly inside the unit circle (modulus below one), and (b) when exactly one of them equals one? For example, can something be deduced from the row sums? I came across Gershgorin's theorem, but it seems too weak for my purposes. Are there stronger versions of Gershgorin's theorem?
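To make the Gershgorin criterion concrete: every eigenvalue lies in the union of discs centered at the diagonal entries, with radii equal to the off-diagonal absolute row sums. If every disc sits inside the unit circle, the spectral radius is below one. A minimal sketch (the matrix here is a made-up example, not from my system):

```python
import numpy as np

# Hypothetical example matrix, just to illustrate the disc computation.
A = np.array([[0.5, 0.2, 0.1],
              [0.1, 0.4, 0.2],
              [0.0, 0.3, 0.6]])

centers = np.diag(A)                                  # disc centers a_ii
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)   # off-diagonal row sums

# Sufficient (but often far from necessary) condition for spectral radius < 1:
discs_inside_unit_circle = bool(np.all(np.abs(centers) + radii < 1))

spectral_radius = np.max(np.abs(np.linalg.eigvals(A)))
print(discs_inside_unit_circle, spectral_radius < 1)
```

This also shows why the theorem feels weak: the discs can poke outside the unit circle even when all eigenvalues are safely inside it.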
Specifically, the application I have in mind is the coefficient matrix of a dynamical system. I have matrices of the form \begin{equation} \mathbf{A}=\left( \begin{matrix} \mathbf{a} & 0 \\ \operatorname{diag}(n) & \mathbf{0} \end{matrix} \right),\end{equation} so you can pretty much tell that I have transformed a higher-order system into a first-order one. The lower part of the matrix is always the same; only $\mathbf{a}$ varies. I want to know when the system stays stable, and to that end, I need the eigenvalues.
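If I read the block structure right, this is a companion-type matrix: the coefficients $\mathbf{a}$ sit in the first row and a shift block sits below. A sketch of the stability check I'm after (the construction below assumes the lower block acts as an identity shift, which is my reading of the question's $\operatorname{diag}(n)$ block):

```python
import numpy as np

def companion(a):
    """Companion matrix of x_{k+1} = a[0]*x_k + a[1]*x_{k-1} + ...
    First row holds the coefficients; a subdiagonal identity shifts the
    state vector (an assumption about the question's lower block)."""
    n = len(a)
    A = np.zeros((n, n))
    A[0, :] = a
    A[1:, :-1] = np.eye(n - 1)
    return A

def is_stable(a):
    """Discrete-time stability: all eigenvalues strictly inside the unit circle."""
    return bool(np.max(np.abs(np.linalg.eigvals(companion(a)))) < 1)

# Example recurrence: x_{k+1} = 0.5 x_k + 0.2 x_{k-1}
print(is_stable([0.5, 0.2]))  # True for this choice of coefficients
```

Of course this is the brute-force route; the question is precisely whether there are criteria on $\mathbf{a}$ (e.g. row sums) that avoid computing the eigenvalues at all.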