Convert $(A \lambda^2 + B \lambda + C) \vec{x} = \vec{0}$ to eigenvalue equation


I want to convert the following equation to an eigenvalue equation, $$(A \lambda^2 + B \lambda + C) \vec{x} = \vec{0},$$ for eigenvalues $\lambda$ given that $A$ is invertible.

So far I have used the invertibility of $A$ to write $$A^{-1}(A \lambda^2 + B \lambda + C) \vec{x} = \vec{0},$$ $$(I \lambda^2 + A^{-1}B \lambda + A^{-1}C) \vec{x} = \vec{0}.$$ From here I do not know how to bring the left-hand side into the form $(Q-I\lambda)\vec{x}=\vec{0}$. Am I on the right track?

Can I use the quadratic formula as, $$Q_{1,2} = -\frac{A^{-1}B}{2} \pm \sqrt{\left(\frac{A^{-1}B}{2} \right)^2 - A^{-1}C},$$ and then have "left" and "right" eigenvalue equations as, $$(Q_1 - I\lambda)(Q_2 - I\lambda)\vec{x} = \vec{0}.$$ Is that a valid solution?

Edit: A, B, and C are $n\times n$ matrices.
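
As a quick numerical sanity check of the reduction above: multiplying by $A^{-1}$ rescales the determinant by the nonzero factor $\det(A^{-1})$, so the monic pencil is singular at exactly the same $\lambda$. A sketch using numpy, with random matrices standing in for the question's $A$, $B$, $C$ (all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted, so invertible
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
Ainv = np.linalg.inv(A)

lam = 0.7  # any test value of lambda
P1 = lam**2 * A + lam * B + C                        # original pencil
P2 = lam**2 * np.eye(n) + lam * Ainv @ B + Ainv @ C  # monic pencil

# det(P2) = det(A^{-1}) * det(P1), so both vanish for the same lambdas
assert np.isclose(np.linalg.det(P2), np.linalg.det(Ainv) * np.linalg.det(P1))
```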

There are 2 answers below.

Answer 1:

You cannot get rid of the square in the same dimension, but you can "linearize" your matrix pencil in higher dimensions. Namely, the eigenvalues of the $2n\times 2n$ matrix $$ \begin{pmatrix}-A^{-1}B&-A^{-1}C\\I&0\end{pmatrix} $$ are exactly those values $\lambda$ for which $\lambda^2 A + \lambda B + C$ is singular.
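
This companion-style linearization is easy to verify numerically. A sketch using numpy, with random matrices standing in for $A$, $B$, $C$ (not part of the original answer; all names illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted, so invertible
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

Ainv = np.linalg.inv(A)
D, E = Ainv @ B, Ainv @ C

# 2n x 2n companion (linearization) matrix
M = np.block([[-D, -E],
              [np.eye(n), np.zeros((n, n))]])

eigvals = np.linalg.eigvals(M)

# Each eigenvalue of M makes the original quadratic pencil singular:
# the smallest singular value of lam^2 A + lam B + C is (numerically) zero.
for lam in eigvals:
    smallest_sv = np.linalg.svd(lam**2 * A + lam * B + C, compute_uv=False)[-1]
    assert smallest_sv < 1e-7
```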

Answer 2:

You have remarked that your problem is equivalent to finding $\vec{x},\lambda$ such that

$$(I \lambda^2 + A^{-1}B \lambda + A^{-1}C) \vec{x} = \vec{0}.$$

In other words, setting $D:=A^{-1}B$ and $E:=A^{-1}C$, our problem is equivalent to finding $\lambda$ such that:

$$\det(I \lambda^2 + D \lambda + E)=0 \tag{1}$$

(indeed, if the determinant of this matrix is zero, its kernel is not reduced to $\{0\}$, so a nonzero $\vec{x}$ exists).

Let us construct a family of solutions.

Let us recall the Schur determinant lemma (http://files.ele-math.com/articles/jmi-03-16.pdf), valid for any $2 \times 2$ block matrix whose upper-left block is invertible (here $A,B,C,D$ denote generic blocks, not the matrices of the question):

$$\det\left(\begin{matrix}A&B\\C&D\end{matrix}\right)=\det(A)\det(D-CA^{-1}B)$$
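
The lemma can be sanity-checked numerically. A sketch (not part of the original answer) using generic blocks $P,Q,R,S$ to avoid clashing with the question's $A,B,C$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
# Blocks P, Q, R, S; P plays the role of the upper-left block and must be invertible
P = rng.standard_normal((n, n)) + n * np.eye(n)
Q = rng.standard_normal((n, n))
R = rng.standard_normal((n, n))
S = rng.standard_normal((n, n))

M = np.block([[P, Q], [R, S]])
lhs = np.linalg.det(M)
rhs = np.linalg.det(P) * np.linalg.det(S - R @ np.linalg.inv(P) @ Q)
assert abs(lhs - rhs) < 1e-8 * max(1.0, abs(lhs))
```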

Let us apply this result to the following matrix (where $I$ is the $n \times n$ identity matrix and $\lambda \neq 0$):

$$\det\left(\begin{matrix}-\lambda I&V\\U&(W-\lambda I)\end{matrix}\right)$$

$$=\det(-\lambda I)\det\left(W-\lambda I-U(-\lambda I)^{-1}V\right)\tag{2}$$ $$=\det\left(-\lambda \left(W-\lambda I+\tfrac{1}{\lambda}UV\right)\right)$$

$$=\det(\lambda^2 I-\lambda W-UV)$$

Identifying this with (1), it suffices to take

$$W=-D \ \text{and} \ U,V \ \text{such that} \ UV=-E,$$

for example $U=-E$ and $V=I$ (similar to the solution provided by @amsmath), but we can just as well use $U=-EF,\ V=F^{-1}$ for any invertible $F$.

Conclusion: In view of (2), a rather general form of matrices whose eigenvalues are solutions of (1) is:

$$\left(\begin{matrix}0&F^{-1}\\-EF&-D\end{matrix}\right) \ \text{for any invertible} \ F$$
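
The claim that every invertible $F$ yields the same spectrum can be checked numerically. A sketch using numpy, with random matrices standing in for $A$, $B$, $C$ (the helper `linearization` and all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n)) + n * np.eye(n)  # diagonally shifted, so invertible
B = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))
D = np.linalg.solve(A, B)  # A^{-1} B
E = np.linalg.solve(A, C)  # A^{-1} C

def linearization(F):
    """Build the 2n x 2n matrix [[0, F^{-1}], [-E F, -D]] for invertible F."""
    Finv = np.linalg.inv(F)
    return np.block([[np.zeros((n, n)), Finv],
                     [-E @ F, -D]])

# Two different invertible choices of F give the same eigenvalues
ev1 = np.sort_complex(np.linalg.eigvals(linearization(np.eye(n))))
F = rng.standard_normal((n, n)) + n * np.eye(n)
ev2 = np.sort_complex(np.linalg.eigvals(linearization(F)))
assert np.allclose(ev1, ev2, atol=1e-6)
```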