Diagonalizing a nearly-diagonal matrix


Take a (kind of) arrowhead real-symmetric matrix of the general form

$$ M = \begin{bmatrix} a_{11} & a_{12} & a_{13} & a_{14} & a_{15} & a_{16} \\ a_{12} & a_{22} & a_{23} & a_{24} & a_{25} & a_{26} \\ a_{13} & a_{23} & a_{33} & 0 & 0 & 0 \\ a_{14} & a_{24} & 0 & a_{44} & 0 & 0 \\ a_{15} & a_{25} & 0 & 0 & a_{55} & 0 \\ a_{16} & a_{26} & 0 & 0 & 0 & a_{66} \\ \end{bmatrix} $$

where the sizes of the blocks may vary; in general, however, the diagonal submatrix has dimension close to that of the entire matrix. Is there a method to diagonalise this matrix that takes advantage of the largely diagonal structure? My goal is computational efficiency relative to a dense approach (e.g. dgemm-based routines).

I require all of the eigenvalues and eigenvectors of this matrix, i.e. $V^{-1}MV = W$, where the columns of $V$ are the eigenvectors of $M$ and $W$ is a diagonal matrix containing the eigenvalues. (Since $M$ is real symmetric, $V$ can be taken orthogonal, so $V^{-1} = V^T$.)
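For concreteness, here is a minimal sketch (assuming NumPy, with a randomly generated instance and an illustrative border width of 2) that builds a matrix of this shape and computes the full eigendecomposition with a dense LAPACK-backed routine, which serves as the baseline any structure-exploiting method would be compared against:

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 6, 2  # total size and border width; b = 2 matches the pattern above

# Build a symmetric "arrowhead" matrix: dense b x n border, diagonal tail
M = np.zeros((n, n))
M[:b, :] = rng.standard_normal((b, n))
M = M + M.T                                # symmetrize the border
idx = np.arange(b, n)
M[idx, idx] = rng.standard_normal(n - b)   # diagonal tail entries

# Dense baseline: all eigenvalues w and orthonormal eigenvectors V
w, V = np.linalg.eigh(M)
assert np.allclose(V @ np.diag(w) @ V.T, M)
```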


One way of leveraging this structure to gain efficiency is to notice that if the diagonal block is large compared to the rest, then you can save a lot of time by using Krylov methods, as the matrix-vector product can be computed quite fast, and the full matrix need not be stored in memory. The simplest version of this is probably the Arnoldi algorithm, which many computing environments implement under the hood. To my knowledge, both MATLAB's `eigs` and SciPy's `scipy.sparse.linalg.eigs` are wrappers for a version of the Arnoldi algorithm found in ARPACK, although I do not know any more details.
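A sketch of this idea, assuming a border width `b` and a random instance: since the matrix is symmetric, the Lanczos variant applies, which SciPy exposes as `scipy.sparse.linalg.eigsh`. The key point is that the matrix-vector product costs $O(bn)$ rather than $O(n^2)$, and the full matrix is never formed. Note the usual caveat that ARPACK-style solvers return a subset of `k < n` eigenpairs, so this fits best when not all eigenvalues are needed:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

rng = np.random.default_rng(1)
n, b = 500, 3                       # small dense border, large diagonal tail

B = rng.standard_normal((b, n))     # dense border rows (and, by symmetry, columns)
B[:b, :b] = B[:b, :b] + B[:b, :b].T # make the top-left b x b block symmetric
d = rng.standard_normal(n - b)      # diagonal tail entries

def matvec(x):
    # O(b*n) product M @ x without ever forming the full n x n matrix
    y = np.empty(n)
    y[:b] = B @ x                            # dense border rows
    y[b:] = B[:, b:].T @ x[:b] + d * x[b:]   # border columns + diagonal tail
    return y

A = LinearOperator((n, n), matvec=matvec, dtype=float)
w, V = eigsh(A, k=6, which='LA')    # 6 largest eigenvalues via Lanczos/ARPACK
```

For the full spectrum the question asks about, one would either call `eigsh` repeatedly with shifts or fall back to a dense solver; the operator approach above only pays off when a modest number of eigenpairs suffices.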