Inhomogeneous eigenvalue problem


I am looking for a method of solving eigenvalue problems of the form (I apologize if I gave a misleading title):

\begin{align} \mathbf{A}x + \lambda\mathbf{B}x-c = 0 \end{align}

In particular, I am looking to solve them numerically in MATLAB, but knowing the proper name for this type of problem would be super helpful!


There are 3 best solutions below


I guess you are trying to solve $(A+\lambda B)x=c$ for some non-zero $x$. For a solution to exist, the vector $c$ needs to be in the column space of the matrix on the left. So this becomes a matter of solving for the values of $\lambda$ that make the rank of $A+\lambda B$ equal to the rank of the augmented matrix $[A+\lambda B \mid c]$. Have you tried?
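As a sketch of that rank test (in Python rather than MATLAB, but straightforward to port; the routine names are my own, not from any library): for a trial $\lambda$, compare the rank of $A+\lambda B$ with the rank of the augmented matrix $[A+\lambda B \mid c]$:

```python
def rank(rows, tol=1e-10):
    """Rank of a small dense matrix via Gaussian elimination with pivoting."""
    m = [row[:] for row in rows]          # work on a copy
    nrows, ncols = len(m), len(m[0])
    r = 0
    for col in range(ncols):
        if r == nrows:
            break
        # largest pivot in this column at or below row r
        piv = max(range(r, nrows), key=lambda i: abs(m[i][col]))
        if abs(m[piv][col]) < tol:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(nrows):
            if i != r:
                f = m[i][col] / m[r][col]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def is_consistent(A, B, c, lam):
    """True if (A + lam*B) x = c has at least one solution x."""
    n = len(A)
    M = [[A[i][j] + lam * B[i][j] for j in range(n)] for i in range(n)]
    M_aug = [row + [c[i]] for i, row in enumerate(M)]
    return rank(M) == rank(M_aug)
```

For a nonsingular $A+\lambda B$ the test is always true; the interesting $\lambda$'s are the ones where the matrix becomes singular yet the ranks still agree.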


It is quite a long time since you posted, but I think I have some idea of how to solve it. First off, the title seems about right: it should be an inhomogeneous eigenvalue problem, and the original article can be found by searching for "On Inhomogeneous Eigenvalue Problems" by R. M. M. Mattheij and G. Söderlind. I have also written a bit about it (if I may do some self-advertisement) in an article of my own, "Worst-case compliance for independently constrained uncertain loads" by H. Hederberg and CJ. Thore, where the material relevant to your question is on page 8 as well as pages 11-12.

If your matrix $\mathbf{B}$ is positive definite, you can transform your problem from a generalized eigenvalue problem to a regular one using a positive definite matrix square root $\mathbf{B}^{1/2}$ (or perhaps a Cholesky factor). By multiplying from the left by the inverse matrix square root $\mathbf{B}^{-1/2}$ we get
$$\begin{align} &\mathbf{B}^{-1/2}\mathbf{A}x + \lambda\mathbf{B}^{-1/2}\mathbf{B}x-\mathbf{B}^{-1/2}\mathbf{c} = 0 \Leftrightarrow\\ & \mathbf{B}^{-1/2}\mathbf{A}\mathbf{x} + \lambda\mathbf{B}^{1/2}\mathbf{x}-\mathbf{B}^{-1/2}\mathbf{c} = 0 \end{align}$$

and then, introducing the transformed eigenvector $\mathbf{y} = \mathbf{B}^{1/2}\mathbf{x}$, we get

$$\begin{align} &\mathbf{B}^{-1/2}\mathbf{A}\mathbf{B}^{-1/2}\mathbf{y} + \lambda \mathbf{y}-\mathbf{B}^{-1/2}\mathbf{c} = 0 \Leftrightarrow\\ & \lambda \mathbf{y} = \mathbf{C}\mathbf{y}-\mathbf{d} \end{align}$$ where $\mathbf{C} = -\mathbf{B}^{-1/2}\mathbf{A}\mathbf{B}^{-1/2}$ and $\mathbf{d} = -\mathbf{B}^{-1/2}\mathbf{c}$ to fit the notation of the article. Typically a norm on the eigenvector is introduced as $\mathbf{y}^\mathsf{T}\mathbf{y}=1$.
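A minimal sketch of this transformation in Python, assuming for simplicity that $\mathbf{B}$ is diagonal, so that $\mathbf{B}^{-1/2}$ is just the reciprocal square root of each diagonal entry (for a general positive definite $\mathbf{B}$ you would use a Cholesky factor or a matrix square root instead; the function name is my own):

```python
import math

def transform(A, B_diag, c):
    """Return C = -B^(-1/2) A B^(-1/2) and d = -B^(-1/2) c
    for a diagonal positive definite B given by its diagonal B_diag."""
    s = [1.0 / math.sqrt(b) for b in B_diag]   # diagonal of B^(-1/2)
    n = len(c)
    C = [[-s[i] * A[i][j] * s[j] for j in range(n)] for i in range(n)]
    d = [-s[i] * c[i] for i in range(n)]
    return C, d
```

Note that when $\mathbf{A}$ is symmetric, $\mathbf{C}$ stays symmetric, which is what makes the transformed problem pleasant to work with.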

Working out the implementation details leads to a quadratic eigenvalue problem, which can be solved by transforming it into a so-called companion matrix form; see page 10 of the article for details. The quadratic eigenvalue problem seems to have more research behind it. Once an eigenvalue is found, the eigenvector is obtained as $$ \begin{align} \mathbf{y} = (\mathbf{C}-\lambda\mathbf{I})^{-1}\mathbf{d} \end{align} $$ provided that $\lambda$ is $\underline{\mathrm{not}}$ an eigenvalue of $\mathbf{C}$, i.e., $(\mathbf{C}-\lambda\mathbf{I})\mathbf{v}=\mathbf{0}$ only for $\mathbf{v}=\mathbf{0}$. We then transform back via $\mathbf{x}= \mathbf{B}^{-1/2}\mathbf{y}$, and you have your eigenvalue and eigenvector.
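A small sketch of this recovery step (Python, with a hand-rolled Gaussian elimination to stay self-contained; in MATLAB it would just be a backslash solve, and it is only valid when $\lambda$ is not an eigenvalue of $\mathbf{C}$):

```python
def solve(M, b):
    """Solve M x = b for a small dense M by Gaussian elimination
    with partial pivoting."""
    n = len(b)
    a = [row[:] + [b[i]] for i, row in enumerate(M)]   # augmented matrix
    for k in range(n):
        piv = max(range(k, n), key=lambda i: abs(a[i][k]))
        a[k], a[piv] = a[piv], a[k]
        for i in range(k + 1, n):
            f = a[i][k] / a[k][k]
            a[i] = [u - f * v for u, v in zip(a[i], a[k])]
    x = [0.0] * n
    for k in reversed(range(n)):
        x[k] = (a[k][n] - sum(a[k][j] * x[j] for j in range(k + 1, n))) / a[k][k]
    return x

def eigvec_from_eigval(C, d, lam):
    """Recover y = (C - lam*I)^(-1) d for a known eigenvalue lam."""
    n = len(d)
    M = [[C[i][j] - (lam if i == j else 0.0) for j in range(n)] for i in range(n)]
    return solve(M, d)
```

As a sanity check, the returned $\mathbf{y}$ should satisfy $\lambda\mathbf{y} = \mathbf{C}\mathbf{y}-\mathbf{d}$.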

One can solve for the largest eigenvalue (even though experience tells me that it sometimes fails to converge to the largest and opts for the second largest instead...) by using the power method explained on page 11 of the article.
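For illustration, here is a bare-bones variant of such a power iteration (my own simplified sketch, not the article's exact method: repeat $z = \mathbf{C}\mathbf{y}-\mathbf{d}$, $\lambda = \|z\|$, $\mathbf{y} = z/\lambda$; a fixed point satisfies the problem exactly, but this simple version only produces a positive $\lambda$ and, as said above, may not converge to the largest one):

```python
import math

def matvec(M, v):
    """Dense matrix-vector product."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def inhomogeneous_power(C, d, iters=500):
    """Fixed-point iteration for lam*y = C*y - d with ||y||_2 = 1.
    Assumes ||C*y - d|| stays away from zero along the iteration."""
    n = len(d)
    y = [1.0 / math.sqrt(n)] * n                       # arbitrary unit start
    lam = 0.0
    for _ in range(iters):
        z = [a - b for a, b in zip(matvec(C, y), d)]   # z = C*y - d
        lam = math.sqrt(sum(v * v for v in z))         # lam = ||z||
        y = [v / lam for v in z]                       # renormalize
    return lam, y
```

A tolerance-based stopping criterion on the residual $\|\mathbf{C}\mathbf{y}-\mathbf{d}-\lambda\mathbf{y}\|$ would be more practical than a fixed iteration count; this is kept minimal on purpose.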

The answer got rather long so I skipped a few details about implementation but hopefully you (or I guess other future eigenvalue enthusiasts) can go from here. Please feel free to ask again if something is unclear.


You should probably add something to your question in order to get the response you want. Indeed, for every given $\lambda$ such that $A+\lambda B$ is full-rank, you can solve for a unique $x$ such that $(A+\lambda B)x=c$ using whichever solver best applies, and there can be infinitely many such $\lambda$'s. If, however, $A+\lambda B$ is singular, then the system only has solutions if $c$ is in the range of $A+\lambda B$. If that is indeed the case, then there are infinitely many solutions: for every $x$ such that $Ax+\lambda Bx-c=0$, you also have $A(x+y)+\lambda B(x+y)-c=0$ for any $y$ which lies in the null space of $A+\lambda B$. If this is the situation you find yourself in, you might be able to get one valid solution using a Krylov subspace solver.
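A toy illustration of the singular case (the values are my own, chosen so that $A+\lambda B$ is singular at $\lambda=-1$ while $c$ stays in its range):

```python
# With A = diag(1, 2) and B = I, the choice lam = -1 makes
# M = A + lam*B = diag(0, 1) singular; c = (0, 1) lies in its range.
A = [[1.0, 0.0], [0.0, 2.0]]
B = [[1.0, 0.0], [0.0, 1.0]]
lam = -1.0
M = [[A[i][j] + lam * B[i][j] for j in range(2)] for i in range(2)]  # diag(0, 1)

c = [0.0, 1.0]
x = [0.0, 1.0]            # one particular solution of M x = c
y = [1.0, 0.0]            # spans the null space of M
x2 = [xi + yi for xi, yi in zip(x, y)]   # a second, equally valid solution

def residual(v):
    """Entries of M v - c."""
    return [sum(M[i][j] * v[j] for j in range(2)) - c[i] for i in range(2)]
```

Both `x` and `x2` solve the system exactly, which is the non-uniqueness described above.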

But I don't think this is the answer you're searching for. You probably consider $\lambda$ as part of the solution, and not as something which is given a priori. If that is indeed the case, you need to add an equation to your system, most likely a constraint on a norm of $x$. Mattheij and Söderlind (1987) assume $\|x\|_2=1$. In your case, we may speak of a generalized inhomogeneous eigenvalue problem, i.e., $B\neq I$, in which case it seems reasonable to set $x^HBx=1$, as it is somewhat of a convention for generalized eigensolvers. I do not think this particular inhomogeneous problem has been treated before, but assuming $A$ and, particularly, $B$ are real symmetric positive definite, it is only a small step away from the work of Mattheij and Söderlind (1987), so you can probably extend their work and maybe come up with your own power method for this specific problem, which will likely require a factorization of, or an inner solver for, $B$.