the solution of matrix polynomials


I want to find the eigenvalues of \begin{equation} P=\left[ \begin{array}{cc} 0_{n\times n} & I_{n\times n} \\ -A & -B% \end{array} \right], \end{equation} where $A$ and $B$ are both $n\times n$ symmetric positive semi-definite matrices whose row sums are all $0$. They can be regarded as Laplacian matrices of undirected graphs.

We have \begin{equation} \lambda I-P=\left[ \begin{array}{cc} \lambda I_{n\times n} & -I_{n\times n} \\ A & \lambda I_{n\times n}+B% \end{array} \right] , \end{equation} and since the two top blocks commute with the bottom ones as scalar multiples of the identity, taking a Schur complement gives \begin{equation} \det \left( \lambda I-P\right) =\det \left( \lambda ^{2}I_{n\times n}+\lambda B+A\right) . \end{equation} Suppose the eigenvalues of $A$ and $B$ are respectively $0=\alpha _{1}\leq \alpha _{2}\leq \cdots \leq \alpha _{n}$ and $0=\beta _{1}\leq \beta _{2}\leq \cdots \leq \beta _{n}$. What is the relationship between the eigenvalues of $P$ and the eigenvalues of $A$ and $B$?
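As a quick numerical sanity check of the determinant identity, here is a sketch in NumPy. The two Laplacians below (path $P_4$ for $A$, cycle $C_4$ for $B$) are my own arbitrary choices, not from the question:

```python
import numpy as np

# Hypothetical example: A = Laplacian of the path P4, B = Laplacian of the cycle C4.
# Both are symmetric PSD with zero row sums, as in the question.
n = 4
A = np.array([[ 1., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])
B = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])

I = np.eye(n)
P = np.block([[np.zeros((n, n)), I],
              [-A, -B]])

# Check det(lambda*I - P) = det(lambda^2*I + lambda*B + A) at an arbitrary point
lam = 0.7 + 0.3j
lhs = np.linalg.det(lam * np.eye(2 * n) - P)
rhs = np.linalg.det(lam**2 * I + lam * B + A)
print(abs(lhs - rhs))  # close to 0 up to rounding

# Consequently every eigenvalue of P is a root of det(lambda^2*I + lambda*B + A)
eigs_P = np.linalg.eigvals(P)
residuals = [abs(np.linalg.det(l**2 * I + l * B + A)) for l in eigs_P]
print(max(residuals))  # close to 0 up to rounding
```

Note that $P$ has $2n$ eigenvalues in total, one pair for each root of the $2n$-degree polynomial $\det(\lambda^2 I + \lambda B + A)$.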


Best answer:

If $A$ and $B$ commute, you can simultaneously diagonalize them, and the eigenvalues of $P$ are the roots of quadratics of the form $\lambda^2 +\beta_r \lambda +\alpha_r$, where $\alpha_r$ and $\beta_r$ are the eigenvalues of $A$ and $B$ on a common eigenvector (note that this pairing need not match the sorted orderings). If $A$ and $B$ do not commute, I fear there is little that can be said usefully.
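A sketch of the commuting case, with Laplacians chosen arbitrarily so that they commute (the cycle $C_4$ and the complete graph $K_4$ — both circulant, hence commuting):

```python
import numpy as np

n = 4
# Hypothetical commuting pair: A = Laplacian of the cycle C4, B = Laplacian of K4.
A = np.array([[ 2., -1.,  0., -1.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [-1.,  0., -1.,  2.]])
B = 4.0 * np.eye(n) - np.ones((n, n))   # K4 Laplacian, also circulant

assert np.allclose(A @ B, B @ A)        # A and B commute

# Simultaneously diagonalize: an orthonormal eigenbasis of A also diagonalizes B here.
w, V = np.linalg.eigh(A)
alphas = w                          # eigenvalues of A
betas = np.diag(V.T @ B @ V)        # eigenvalues of B, paired by shared eigenvector

# Eigenvalues of P are the roots of lambda^2 + beta_r*lambda + alpha_r
roots = np.concatenate([np.roots([1.0, b, a]) for a, b in zip(alphas, betas)])

P = np.block([[np.zeros((n, n)), np.eye(n)], [-A, -B]])
eigs_P = np.linalg.eigvals(P)

# Both spectra agree up to rounding (the double roots are defective eigenvalues
# of P, so expect errors on the order of sqrt(machine epsilon) there).
print(np.max(np.abs(np.sort_complex(roots) - np.sort_complex(eigs_P))))
```

The pairing matters: $\alpha_r$ and $\beta_r$ must come from the same common eigenvector, which is why `betas` is read off from $V^{\top} B V$ rather than taken in sorted order.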

Having $A$ and $B$ symmetric and positive semidefinite does not help much. Your matrix $P$ can be viewed as the block companion matrix (companion linearization) of the quadratic matrix polynomial $\lambda^2 I + \lambda B + A$, for whatever that is worth.