I have a matrix of $9,200 \times 9,200$ elements. I have approximately $90$ of these matrices to invert.
The reason is that I am running a nonlinear regression on a problem with significant errors in both the independent and dependent variables. On each run I update the covariance matrix of the dependent variable, which must then be inverted to form the weights in the weighted regression problem.
The matrix has a sparse banded structure. The structure of the first $1,000$ values looks like this, and the structure of the first $20$ values looks like this. That is, there are $2 \times 2$ blocks running down the main diagonal, and also off-diagonals separated by $368$ zeros. The matrix is also symmetric positive definite.
I believe this is called a banded matrix. Given that only $0.5\%$ of the matrix is non-zero, I believe there is a faster way of computing the inverse, but I am struggling to find it.
To expand on George's answer: if we have a matrix whose blocks are diagonal matrices $D_{ij}$, like this:
$$M = \left[\begin{array}{ccc} D_{11}&\cdots &D_{1n}\\\vdots&\ddots&\vdots\\D_{n1}&\cdots&D_{nn} \end{array}\right]$$
then we can always find a permutation similarity
$$A = PMP^{t}$$
(note that $P^{t} = P^{-1}$ for a permutation matrix, so this is indeed a similarity) such that
$$A = \left[\begin{array}{ccc} B_{1}&0 &0\\0&\ddots&0\\0&0&B_{m} \end{array}\right]$$
is block diagonal.
In GNU Octave or MATLAB this permutation can be expressed as a sparse matrix (or applied directly by reindexing rows and columns), so applying it is cheap.
Inverting $A$ then reduces to inverting each of the smaller blocks $B_{i}$ separately.
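Here is a minimal NumPy sketch of the whole idea (the sizes $n$, $d$ and the random diagonal blocks are illustrative, not taken from the question). The permutation is the "perfect shuffle" that regroups the $k$-th diagonal entry of every block $D_{ij}$ into one $n \times n$ block $B_k$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 4, 3          # n x n grid of d x d diagonal blocks (illustrative sizes)

# Build M from diagonal blocks D_ij; symmetrize, and add a diagonal
# shift so the example matrix is positive definite.
D = rng.standard_normal((n, n, d))
D = (D + D.transpose(1, 0, 2)) / 2           # D_ij = D_ji, so M is symmetric
M = np.zeros((n * d, n * d))
for i in range(n):
    for j in range(n):
        M[i*d:(i+1)*d, j*d:(j+1)*d] = np.diag(D[i, j])
M += n * d * np.eye(n * d)                   # diagonal shift -> positive definite

# Perfect-shuffle permutation: old index i*d + k -> new index k*n + i.
perm = np.array([i*d + k for k in range(d) for i in range(n)])
P = np.eye(n * d)[perm]                      # permutation matrix (sparse in practice)
A = P @ M @ P.T                              # block diagonal: d blocks of size n x n

# Invert each n x n block separately, then undo the permutation.
Ainv = np.zeros_like(A)
for k in range(d):
    s = slice(k * n, (k + 1) * n)
    Ainv[s, s] = np.linalg.inv(A[s, s])
Minv = P.T @ Ainv @ P                        # M^{-1} = P^t A^{-1} P
```

In Octave/MATLAB the same thing is done with `P = sparse(1:n*d, perm, 1)` and indexing, and for a symmetric positive definite matrix you would normally use a (banded or sparse) Cholesky factorization and triangular solves on each block rather than an explicit `inv`.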