Eigenvalue-like problem for block matrices -- numerical solutions


Consider the following matrix equation $$ \left[\begin{array}{cc}{A_{11}} & {A_{12}} \\ {A_{21}} & {A_{22}}\end{array}\right]\left[\begin{array}{c}{v_1} \\ {v_2} \end{array}\right] = \left[\begin{array}{cc}{\lambda_1 I} & {0} \\ {0} & {\lambda_2 I}\end{array}\right]\left[\begin{array}{c}{v_1} \\ {v_2} \end{array}\right]$$

where $A_{ij}$ are known $N\times N$ matrices ($A_{11}$, $A_{22}$ are symmetric tridiagonal and $A_{12}$, $A_{21}$ are diagonal). We need to solve for $\lambda_1$ and $\lambda_2$ (and optionally $v_1$, $v_2$). A further simplification is that the two numbers are simple functions of a single scalar $x$: $\lambda_1(x)$ and $\lambda_2(x)$.

What is an efficient way to solve for the $\lambda$s (i.e. $x$) when $N\sim 3000$?
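For concreteness, the structure above can be assembled as follows (a minimal Python/SciPy sketch; the random test data and the particular forms of $\lambda_1(x)$ and $\lambda_2(x)$ are placeholder assumptions, not part of the actual problem):

```python
import numpy as np
import scipy.sparse as sp

N = 3000
rng = np.random.default_rng(0)

def sym_tridiag(n):
    """Placeholder symmetric tridiagonal matrix with random entries."""
    d = rng.standard_normal(n)        # main diagonal
    e = rng.standard_normal(n - 1)    # off-diagonal (shared, so symmetric)
    return sp.diags([e, d, e], [-1, 0, 1], format="csr")

# A11, A22 symmetric tridiagonal; A12, A21 diagonal (as stated).
A11, A22 = sym_tridiag(N), sym_tridiag(N)
A12 = sp.diags(rng.standard_normal(N))
A21 = sp.diags(rng.standard_normal(N))

# Assumed forms of lambda_1(x), lambda_2(x) -- substitute the real ones.
lam1 = lambda x: x
lam2 = lambda x: 2.0 * x

I = sp.identity(N)

def M(x):
    """Residual matrix; the sought x makes M(x) singular."""
    return sp.bmat([[A11 - lam1(x) * I, A12],
                    [A21,               A22 - lam2(x) * I]], format="csc")
```

Keeping everything in sparse format matters here: the tridiagonal-plus-diagonal structure means $M(x)$ has only $O(N)$ nonzeros, so each evaluation is cheap even for $N\sim 3000$.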

My first attempt was trying $$ \det \left[ \begin{array}{cc}{A_{11} - \lambda_1 I} & {A_{12}} \\ {A_{21}} & {A_{22} - \lambda_2 I}\end{array} \right] =0$$ but for matrices of this size the determinant either underflows toward zero or overflows, so evaluating it directly is not suitable for any iterative numerical scheme.
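One standard workaround for the scaling problem (a sketch, not something from the question itself) is to never form the determinant as a single number: `numpy.linalg.slogdet` returns the sign and the logarithm of $|\det|$ separately, and for bracketing a root of $\det M(x) = 0$ the sign alone suffices. A hypothetical bisection on the sign might look like:

```python
import numpy as np

def det_sign(M):
    """Sign of det(M) via slogdet -- stable where det itself would
    underflow or overflow for large matrices."""
    sign, _logabs = np.linalg.slogdet(M)
    return sign

def bisect_on_sign(Mfun, a, b, tol=1e-10):
    """Bisect on x using only the sign of det(M(x)).
    Assumes det changes sign exactly once in [a, b]."""
    sa = det_sign(Mfun(a))
    while b - a > tol:
        mid = 0.5 * (a + b)
        if det_sign(Mfun(mid)) == sa:
            a = mid
        else:
            b = mid
    return 0.5 * (a + b)
```

Note the caveat: `slogdet` requires a dense array, and a dense factorization of the $2N\times 2N$ matrix costs $O(N^3)$ per evaluation, which is heavy at $N\sim 3000$. With the banded-plus-diagonal sparsity, a sparse LU (e.g. `scipy.sparse.linalg.splu`), summing $\log$ of the absolute diagonal of $U$ and tracking the sign from the factors, would be much cheaper per step.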

Is there a way to cast this as a standard eigenvalue problem, or are there efficient, numerically stable methods for solving something like this?