Confusing statement regarding Routh-Hurwitz criterion


I'm currently reading "On the Solutions and the Steady States of a Master Equation" by Joel Keizer. Keizer introduces a matrix $\Lambda$ with the following properties: $-\Lambda_{ij} \geq 0$ for $i \neq j$ and $\sum_{j \neq i} \Lambda_{ji}=-\Lambda_{ii}$.

After that, he wants to prove some statements concerning this matrix but I struggle to understand a remark in this proof.

The statement:

(a) there is at least one eigenvector of $-\Lambda$ with eigenvalue zero, and all of the nonzero eigenvalues have negative real parts of magnitude less than $2 \max_i{\Lambda_{ii}}$. Furthermore, the zero eigenvector is unique if the graph of $\Lambda$ is strongly connected, that is, if all the states are accessible.

(b) ... (c) ...

The proof starts with this remark:

Before proving statement (a), note that in case some of the components $\Lambda_{ij}$ are zero, the validity of the first part of the statement does not follow directly from the Routh-Hurwitz criterion.

I'm not sure what exactly he means with the remark. So my questions are:

  1. What part does he mean with "first part of the statement"?

  2. Why does this part follow directly from the Routh-Hurwitz criterion if all components are non-zero?

  3. Why does it not follow from the Routh-Hurwitz criterion if some components are zero?

I am grateful for every suggestion.

On BEST ANSWER

First of all, please note that this is an old paper and that the state of the art has advanced considerably since it was written.

  1. The first part of statement (a) is the part asserting that zero is an eigenvalue and that all the other eigenvalues have negative real part. That the matrix has a zero eigenvalue can be read off from the characteristic polynomial, which takes the form $P(s)=s^kP_r(s)$, where $k$ is the algebraic multiplicity of the zero eigenvalue. The stability of the remaining eigenvalues can then be checked by analyzing $P_r(s)$.

    One can check whether a polynomial $Q(x)$ has all of its roots with real part less than $\alpha$ by looking at the Hurwitz stability of the shifted polynomial $Q(y+\alpha)$. For instance, for $Q(x)=(x+1)(x+2)$, one can consider $Q(y-1+\epsilon)=(y+\epsilon)(y+1+\epsilon)$, which is Hurwitz stable for all $\epsilon>0$. As a result, the roots of $Q$ have real part less than or equal to $-1$.

    The issue is that this is complicated to carry out in general: the coefficients of the characteristic polynomial are complicated functions of the entries of the matrix, whereas the entries themselves obey very simple rules that can be exploited far more easily through matrix analysis.

  2. It does not. Since zero is always an eigenvalue here, the full characteristic polynomial is never Hurwitz stable, so the criterion cannot be applied directly whether or not some components vanish; one must first factor out the zero roots as in the first point.

  3. See the first point.
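The shift trick in the first point can be sketched numerically. The following is a minimal illustration (my own, not from the paper): instead of building a Routh table, it forms the shifted polynomial $Q(y+\alpha)$ by polynomial composition and inspects the roots of the result directly.

```python
import numpy as np

def roots_left_of(q_coeffs, alpha):
    """Check whether every root of Q(x) has real part strictly less than alpha.

    Forms the shifted polynomial Q(y + alpha) by composing Q with the
    polynomial y + alpha, then checks that the result is Hurwitz stable,
    i.e. that all of its roots lie in the open left half-plane.
    """
    Q = np.poly1d(q_coeffs)
    shifted = np.polyval(Q, np.poly1d([1.0, alpha]))  # composition Q(y + alpha)
    return bool(np.all(shifted.roots.real < 0.0))

# Q(x) = (x + 1)(x + 2) = x^2 + 3x + 2, as in the example above.
Q = [1.0, 3.0, 2.0]
eps = 1e-6
print(roots_left_of(Q, -1.0 + eps))  # True: both roots have real part <= -1
print(roots_left_of(Q, -1.0 - eps))  # False: the root at -1 lies to the right
```

The $\epsilon$ plays the same role as in the example above: the root at $-1$ sits exactly on the shifted axis, so one tests the slightly displaced polynomial instead.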

The author is simply making the point that Perron-Frobenius theory is the more suitable tool here. While that may not have been obvious at the time, this theory is now well known and widely used. For instance, it allows one to say that if the graph of $\Lambda$ is strongly connected or, equivalently, if $\Lambda$ is irreducible, then the eigenvalue zero is simple and its associated eigenvector is positive. This is a fundamental result in the analysis of Markov chains, for instance.
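As a concrete toy illustration of these claims (my own example, not from the paper): take a small irreducible matrix $G=-\Lambda$ with nonnegative off-diagonal entries and zero column sums, and check numerically that zero is a simple eigenvalue with a positive eigenvector, that the remaining eigenvalues have negative real part, and that their magnitudes stay below the bound $2\max_i \Lambda_{ii}$ from statement (a).

```python
import numpy as np

# A 3x3 example of G = -Lambda: off-diagonal entries are nonnegative and
# every column sums to zero, so Lambda_ij <= 0 off the diagonal and
# sum_{j != i} Lambda_ji = -Lambda_ii, matching the properties in the question.
G = np.array([[-0.5,  0.3,  0.2],
              [ 0.2, -0.7,  0.4],
              [ 0.3,  0.4, -0.6]])
assert np.allclose(G.sum(axis=0), 0.0)      # zero column sums

eigvals, eigvecs = np.linalg.eig(G)
order = np.argsort(-eigvals.real)           # put the zero eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Zero is an eigenvalue; the remaining eigenvalues have negative real part.
assert abs(eigvals[0]) < 1e-12
assert np.all(eigvals[1:].real < 0.0)

# Their magnitudes are below 2 * max_i Lambda_ii = 2 * max_i (-G_ii).
assert np.all(np.abs(eigvals[1:]) < 2.0 * np.max(-np.diag(G)))

# All off-diagonal entries are positive, so G is irreducible and, by
# Perron-Frobenius, the eigenvector for the eigenvalue zero is positive.
v = eigvecs[:, 0].real
v = v / v.sum()                             # normalize to a probability vector
assert np.all(v > 0.0)
print(np.round(v, 4))
```

In the Markov-chain reading, the normalized null vector $v$ is the stationary distribution, and its positivity is exactly the uniqueness/accessibility statement in part (a).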