Solving characteristic equation with matrix coefficients.


Given the equation \begin{equation}\det(\lambda^2 I+ B \lambda +K )=0,\end{equation} where $I,B,K\in \mathbb R^{m \times m}$, $I$ is the identity, and $B$ and $K$ are symmetric with no zero eigenvalues, with $B>0$. Let $n^+(A)$ and $n^-(A)$ denote the number of eigenvalues of $A$ with positive and negative real parts, respectively. Is the number of roots of the above equation with positive real part equal to $n^-(B)+n^-(K)$, and the number of roots with negative real part equal to $n^+(B)+n^+(K)$?
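If it helps, here is a quick numerical sanity check of the conjecture for the case actually posed ($B>0$, so $n^+(B)=m$ and $n^-(B)=0$). This is only a sketch, not part of any proof; the random matrices and variable names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

# Random symmetric positive definite B, so n^+(B) = m and n^-(B) = 0.
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)

# Random symmetric K with prescribed inertia: n^+(K) = 2, n^-(K) = 2.
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
K = Q @ np.diag([2.0, 1.5, -1.0, -0.5]) @ Q.T

# det(lambda^2 I + B lambda + K) = 0 exactly when lambda is an eigenvalue
# of the companion (linearization) matrix J.
J = np.block([[np.zeros((m, m)), np.eye(m)],
              [-K, -B]])
roots = np.linalg.eigvals(J)

n_neg = int((roots.real < 0).sum())  # conjecture: n^+(B) + n^+(K) = 4 + 2 = 6
n_pos = int((roots.real > 0).sum())  # conjecture: n^-(B) + n^-(K) = 0 + 2 = 2
```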

My trial was as follows: suppose $\lambda$ is a root. If $K$ is positive definite, then $$\det(\lambda^2 I+ B \lambda +K )=0 \implies \exists v \in \mathbb{C}^m,\ v \neq 0: v^H(\lambda^2 I+ B \lambda +K)v=0 \implies (v^Hv)\lambda^2+ (v^HBv) \lambda +v^HKv=0.$$ All three coefficients are real and positive, so the real part of $\lambda$ must be negative. The same argument applies when $K<0$: the scalar quadratic then has one positive and one negative real root. The remaining case is when $K$ is indefinite.
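The Rayleigh-quotient step can be checked numerically for a concrete eigenpair (again only a sketch, with random positive definite $B$ and $K$; all names are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
m = 4
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)   # B > 0
C = rng.standard_normal((m, m))
K = C @ C.T + m * np.eye(m)   # K > 0

J = np.block([[np.zeros((m, m)), np.eye(m)], [-K, -B]])
vals, vecs = np.linalg.eig(J)

# For an eigenpair (lambda, [x; lambda x]) of J, x satisfies
# (lambda^2 I + lambda B + K) x = 0, so the scalar quadratic
# a lambda^2 + b lambda + c = 0 holds with a, b, c real and positive.
lam, x = vals[0], vecs[:m, 0]
a = (x.conj() @ x).real       # x^H x   > 0
b = (x.conj() @ B @ x).real   # x^H B x > 0
c = (x.conj() @ K @ x).real   # x^H K x > 0
residual = abs(a * lam**2 + b * lam + c)

# With B > 0 and K > 0, every root lies in the open left half-plane.
all_left = bool(vals.real.max() < 0)
```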


Edit

As Ben Grossmann showed, the equation is the characteristic equation of $$ J = \pmatrix{0 & I \\ -K & -B}, $$ where $B>0$. $J$ cannot have a nonzero purely imaginary eigenvalue: if $\lambda = bi$ with $b \in \mathbb R$ were an eigenvalue, then $\exists v \neq 0: v^H((bi)^2 I+ B (bi) +K)v=0$, and taking imaginary parts gives $b\, v^HBv=0 \iff b=0$ (since $v^HBv>0$).
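A numerical spot-check of this observation (a sketch with a random $B>0$ and an indefinite, nonsingular $K$ of my choosing): no eigenvalue of $J$ should lie on the imaginary axis.

```python
import numpy as np

rng = np.random.default_rng(2)
m = 4
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)   # B > 0
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
K = Q @ np.diag([3.0, 1.0, -2.0, -0.7]) @ Q.T   # indefinite, nonsingular

J = np.block([[np.zeros((m, m)), np.eye(m)], [-K, -B]])
roots = np.linalg.eigvals(J)

# B > 0 rules out nonzero purely imaginary eigenvalues, and nonsingular K
# rules out zero eigenvalues, so no root lies on the imaginary axis.
min_abs_real = np.abs(roots.real).min()
```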

Let $n_0(A)$ denote the number of zero eigenvalues of $A$.

Lemma 1: $n_0(K)=n_0(J)$.

We observe that row reduction brings $J$ to the form $$ J_{\mathrm{red}} = \pmatrix{I & 0\\ 0 & K}, $$ so $\operatorname{rank}(J)=m+\operatorname{rank}(K)$ and hence $n_0(K)=n_0(J)$ (counting geometric multiplicities).
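The rank identity behind Lemma 1 can be confirmed on an example with a singular $K$ (a sketch; the matrices are random choices of mine):

```python
import numpy as np

rng = np.random.default_rng(3)
m = 4
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
K = Q @ np.diag([2.0, 1.0, 0.0, -1.0]) @ Q.T   # exactly one zero eigenvalue

J = np.block([[np.zeros((m, m)), np.eye(m)], [-K, -B]])

# Row reduction eliminates -B against the identity block, so
# rank(J) = m + rank(K) and hence dim ker J = dim ker K.
rank_J = np.linalg.matrix_rank(J)
rank_K = np.linalg.matrix_rank(K)
```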

Let $G(t)=K+t I$ and let $$ J(t)=\pmatrix{0 & I\\ -G(t) & -B}. $$ Observe that for $t$ large enough, $G(t)>0 \implies n_-(J(t))=2m$, and for $t$ negative enough, $G(t)<0 \implies n_-(J(t))=n_+(J(t))=m$. Let the eigenvalues of $J(t)$ be $\{\lambda_i(t)\}$.

As $t$ goes from $-\infty$ to $+\infty$, the eigenvalues of $G(t)$ pass through zero $m$ times in total, which is exactly when they change sign. By Lemma 1, $J(t)$ has a zero eigenvalue precisely at those values of $t$. It is not difficult to see that at each such crossing one of the positive eigenvalues of $J(t)$ changes sign to negative: if not, some positive eigenvalue of $J(t)$ would never change sign and would remain positive as $t \rightarrow +\infty$, a contradiction.

So every time an eigenvalue of $G(t)$ changes sign from negative to positive, an eigenvalue of $J(t)$ changes sign from positive to negative. Hence for all $t$ with $G(t)$ nonsingular, $n_-(J(t))=m+n_+(G(t))=n_+(B)+n_+(G(t))$; taking $t=0$ yields $n_-(J)=n_+(B)+n_+(K)$, as required.
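The homotopy identity can be sampled numerically at several values of $t$ (a sketch; the matrices and sample points are arbitrary choices of mine, taken away from the sign-change values of $G(t)$):

```python
import numpy as np

rng = np.random.default_rng(4)
m = 3
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)   # B > 0, so n_+(B) = m
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
K = Q @ np.diag([1.5, -0.4, -2.5]) @ Q.T

def counts(t):
    """Return (n_-(J(t)), n_+(G(t))) for G(t) = K + t I."""
    G = K + t * np.eye(m)
    J = np.block([[np.zeros((m, m)), np.eye(m)], [-G, -B]])
    n_minus_J = int((np.linalg.eigvals(J).real < 0).sum())
    n_plus_G = int((np.linalg.eigvalsh(G) > 0).sum())
    return n_minus_J, n_plus_G

# Sample t on both sides of all sign changes of G(t)'s eigenvalues; the
# claimed identity is n_-(J(t)) = m + n_+(G(t)) wherever G(t) is nonsingular.
samples = [counts(t) for t in (-10.0, -1.0, 0.0, 1.0, 10.0)]
```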

Is there something wrong with this proof?


A partial (but non-trivial) answer: if $K$ is negative definite, the number of positive/negative roots can be determined as follows.

First, observe that $\det(\lambda^2 I + \lambda B + K) = \det(\lambda I - M_1)$, where $$ M_1 = \pmatrix{-B & -K\\I & 0}. $$ This holds regardless of the definiteness of $K$. If $K$ is negative definite, then there exists a unique positive definite square root $P$ of $-K$ (i.e. $P^2 = -K$), and the matrix $M_1$ is similar to $$ M_2 = \pmatrix{I\\ & P} \pmatrix{-B&-K\\I & 0}\pmatrix{I \\ & P}^{-1} = \pmatrix{-B & P\\P & 0}. $$

The matrix $M_2$ is symmetric and thus has real eigenvalues. Furthermore, since $B$ is invertible, the Haynsworth inertia additivity formula gives $$ n_\pm(M_2) = n_{\pm}(-B) + n_{\pm}(C), $$ where $C$ denotes the Schur complement $$ C = M_2/(-B) = PB^{-1}P. $$ By Sylvester's law of inertia, $n_{\pm}(PB^{-1}P) = n_{\pm}(B^{-1}) = n_{\pm}(B)$, so $n_\pm(M_2) = n_\pm(-B) + n_\pm(B) = m$. We conclude that, regardless of the definiteness of $B$, the numbers of roots with positive real part and with negative real part are both equal to $m$.
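This case can be verified numerically, deliberately using an *indefinite* $B$ to exercise the "regardless of the definiteness of $B$" claim (a sketch; the square root is built via an eigendecomposition, and the matrices are random choices of mine):

```python
import numpy as np

rng = np.random.default_rng(5)
m = 3
# Symmetric *indefinite* invertible B and negative definite K.
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
B = Q @ np.diag([2.0, -1.0, 3.0]) @ Q.T
C = rng.standard_normal((m, m))
K = -(C @ C.T + m * np.eye(m))   # K < 0

# Unique symmetric positive definite square root P of -K.
w, V = np.linalg.eigh(-K)
P = V @ np.diag(np.sqrt(w)) @ V.T

M1 = np.block([[-B, -K], [np.eye(m), np.zeros((m, m))]])
M2 = np.block([[-B, P], [P, np.zeros((m, m))]])

e1 = np.linalg.eigvals(M1)    # spectrum of the companion matrix
e2 = np.linalg.eigvalsh(M2)   # real, since M2 is symmetric

max_imag = np.abs(e1.imag).max()   # ~0: M1 is similar to the symmetric M2
n_pos = int((e2 > 0).sum())        # claimed: m roots with positive real part
n_neg = int((e2 < 0).sum())        # claimed: m roots with negative real part
spectra_match = bool(np.allclose(np.sort(e1.real), np.sort(e2), atol=1e-6))
```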


For the case where $K$ is positive definite, there is a symmetric positive definite $P$ for which $K = P^2$. Applying the same manipulation as before leaves us with the matrix $$ M_2 = \pmatrix{-B & -P\\ P & 0}. $$ We note that the symmetric part of this matrix, $$ M_S = \frac 12(M_2 + M_2^T) = \pmatrix{-B&0\\0 & 0}, $$ is negative semidefinite whenever $B$ is positive definite. Since the real part of every eigenvalue of $M_2$ is bounded above by the largest eigenvalue of $M_S$, it follows that $M_2$ has only eigenvalues with non-positive real part when $B$ is positive definite.
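A numerical check of this bound (a sketch with random positive definite $B$ and $K$ of my choosing; since $K$ is nonsingular and $B>0$, no eigenvalue lies on the imaginary axis, so the real parts come out strictly negative):

```python
import numpy as np

rng = np.random.default_rng(6)
m = 3
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)   # B > 0
C = rng.standard_normal((m, m))
K = C @ C.T + m * np.eye(m)   # K > 0

# Symmetric positive definite square root P with K = P^2.
w, V = np.linalg.eigh(K)
P = V @ np.diag(np.sqrt(w)) @ V.T

M2 = np.block([[-B, -P], [P, np.zeros((m, m))]])

# Symmetric part is block-diag(-B, 0): negative semidefinite for B > 0.
MS = 0.5 * (M2 + M2.T)
max_sym_eig = np.linalg.eigvalsh(MS).max()

# Real parts of M2's eigenvalues are bounded above by the max eigenvalue of MS.
max_real = np.linalg.eigvals(M2).real.max()
```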


Thoughts on the general case:

For the general case, with $B$ positive definite, let $P$ be an invertible matrix such that $PKP^T = \operatorname{diag}(I_{n_+},-I_{n_-})$ (such a $P$ exists by Sylvester's law of inertia).

$$ M_2 = \pmatrix{P\\& P^{-T}} \pmatrix{-B & -K\\ I & 0} \pmatrix{P \\ & P^{-T}}^{-1} = \pmatrix{-PBP^{-1} & -PKP^T\\ (PP^T)^{-1} & 0} $$
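For what it's worth, this transformation can also be checked numerically (a sketch; I construct $P$ from an eigendecomposition of $K$, which is one concrete choice among many):

```python
import numpy as np

rng = np.random.default_rng(7)
m = 3
A = rng.standard_normal((m, m))
B = A @ A.T + m * np.eye(m)   # B > 0
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
K = Q @ np.diag([2.0, 0.5, -1.5]) @ Q.T   # indefinite: n_+ = 2, n_- = 1

# Congruence P K P^T = diag(I_{n_+}, -I_{n_-}): with K = V D V^T and the
# positive eigenvalues ordered first, take P = |D|^{-1/2} V^T.
w, V = np.linalg.eigh(K)
order = np.argsort(-w)
w, V = w[order], V[:, order]
P = np.diag(1.0 / np.sqrt(np.abs(w))) @ V.T
congruence_ok = bool(np.allclose(P @ K @ P.T, np.diag(np.sign(w)), atol=1e-8))

S = np.block([[P, np.zeros((m, m))],
              [np.zeros((m, m)), np.linalg.inv(P).T]])
M1 = np.block([[-B, -K], [np.eye(m), np.zeros((m, m))]])
M2 = S @ M1 @ np.linalg.inv(S)

# Similarity preserves the spectrum; the (2,1) block of M2 is (P P^T)^{-1}.
real_parts_match = bool(np.allclose(np.sort(np.linalg.eigvals(M1).real),
                                    np.sort(np.linalg.eigvals(M2).real),
                                    atol=1e-6))
block_ok = bool(np.allclose(M2[m:, :m], np.linalg.inv(P @ P.T), atol=1e-8))
```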