I have a matrix of the following form: $$K = \begin{pmatrix}A & B \\ B^{\intercal} & C\end{pmatrix}$$
where $A$ is large compared to $B$ and $C$, and $A$ and $C$ are symmetric. $K^{-1}$ has already been computed and is known.
Say $A$ is an $m \times m$ matrix and $C$ is an $n \times n$ matrix, so that $B$ is $m \times n$.
Now, suppose I update $C$ so that it becomes an $(n+a) \times (n+a)$ matrix with completely different values from before (with $a \lt n$), and $B$ is extended to $m \times (n+a)$ accordingly. Is there some way I can compute the inverse of the updated matrix $K'$ while reusing some of the previous work from my computation of $K^{-1}$?
Note that $A$, $B$, and $C$ need not be diagonal.
I found the answer. Thought I'd post it here.
Let the matrix $K$ and its inverse $K^{-1}$ be partitioned into
$K = \begin{pmatrix} A & B \\ B^{\intercal} & C \end{pmatrix}$ and $K^{-1} = \begin{pmatrix} \tilde{A} & \tilde{B_1} \\ \tilde{B_2} & \tilde{C} \end{pmatrix}$
where $A$ and $\tilde{A}$ are $m \times m$ matrices and $C$ and $\tilde{C}$ are $n \times n$ matrices.
The submatrices of $K^{-1}$ are:
$\tilde{A} = A^{-1} + A^{-1}BMB^{\intercal}A^{-1}$
$\tilde{B_1} = -A^{-1}BM$
$\tilde{B_2} = -MB^{\intercal}A^{-1}$
$\tilde{C} = M$
where $M = (C-B^{\intercal}A^{-1}B)^{-1}$
Therefore, since $A$ is unchanged, I can compute $A^{-1}$ once and reuse it in the subsequent computation of $K'^{-1}$; only $M$ and the products involving the new $B$ and $C$ need to be recomputed.
Ref: Carl Edward Rasmussen and Christopher K. I. Williams, *Gaussian Processes for Machine Learning*, Appendix A.3. http://www.gaussianprocess.org/gpml/chapters/RWA.pdf