"Rank-$k$ correction" of a matrix and its significance?


Today my studies led me to read about the matrix inversion lemma, which Wikipedia introduces as follows:

In mathematics (specifically linear algebra), the Woodbury matrix identity, named after Max A. Woodbury, says that the inverse of a rank-$k$ correction of some matrix can be computed by doing a rank-$k$ correction to the inverse of the original matrix.
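For $k=1$ this specializes to the Sherman–Morrison formula, $(A + uv^T)^{-1} = A^{-1} - \frac{A^{-1}uv^TA^{-1}}{1 + v^TA^{-1}u}$. To make the claim concrete, here is a small pure-Python sanity check on a $2\times 2$ toy example of my own (the matrices and vectors are arbitrary, just chosen to keep everything invertible):

```python
# Numerical check of the Sherman-Morrison formula, the rank-one (k = 1)
# case of the Woodbury identity:
#   (A + u v^T)^{-1} = A^{-1} - (A^{-1} u v^T A^{-1}) / (1 + v^T A^{-1} u)
# Pure-Python 2x2 linear algebra, to keep the example self-contained.

def inv2(M):
    """Inverse of a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def outer(u, v):
    return [[u[i] * v[j] for j in range(2)] for i in range(2)]

def add(M, N):
    return [[M[i][j] + N[i][j] for j in range(2)] for i in range(2)]

def sub(M, N):
    return [[M[i][j] - N[i][j] for j in range(2)] for i in range(2)]

def scale(M, s):
    return [[M[i][j] * s for j in range(2)] for i in range(2)]

A = [[4.0, 1.0], [2.0, 3.0]]   # toy invertible matrix
u = [1.0, 2.0]
v = [3.0, 1.0]

# Left-hand side: invert the rank-one-corrected matrix directly.
lhs = inv2(add(A, outer(u, v)))

# Right-hand side: correct A^{-1} by another rank-one term.
Ainv = inv2(A)
Ainv_u = matvec(Ainv, u)                                # A^{-1} u
vT_Ainv = matvec(list(map(list, zip(*Ainv))), v)        # v^T A^{-1}, as a row
denom = 1.0 + sum(v[i] * Ainv_u[i] for i in range(2))   # 1 + v^T A^{-1} u
rhs = sub(Ainv, scale(outer(Ainv_u, vT_Ainv), 1.0 / denom))

assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-9 for i in range(2) for j in range(2))
```

The point of the identity, as I understand it, is that the right-hand side reuses $A^{-1}$ and only does cheap rank-one work, instead of inverting the corrected matrix from scratch.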

After some searching I couldn't find any explanation of the terminology rank-$k$ correction, but I did find a Wikipedia article about the BFGS algorithm that uses the same term:

Instead, the Hessian matrix is approximated using rank-one updates specified by gradient evaluations... the approximate Hessian at stage $k$ is updated by the addition of two matrices $$B_{k+1} = B_k + U_k + V_k$$ Both $U_k$ and $V_k$ are symmetric rank-one matrices... [and together] construct a rank-two update matrix...
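As I read this quote, each of $U_k$ and $V_k$ has the form $ww^T$ for some vector $w$ (a symmetric rank-one matrix), so their sum has rank at most two. A small sanity check of that last statement, using toy vectors of my own: any $3\times 3$ matrix of rank at most $2$ must have zero determinant.

```python
# U = u u^T and V = v v^T are symmetric rank-one matrices; their sum is a
# rank-two update. For a 3x3 matrix, rank <= 2 forces the determinant to vanish.

def det3(M):
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

u = [1.0, 2.0, 3.0]   # toy vectors (any choice works)
v = [4.0, 0.0, 1.0]

# S = u u^T + v v^T, a sum of two rank-one outer products.
S = [[u[i] * u[j] + v[i] * v[j] for j in range(3)] for i in range(3)]

assert abs(det3(S)) < 1e-9          # rank(S) <= 2, so det(S) = 0
assert all(abs(S[i][j] - S[j][i]) < 1e-12
           for i in range(3) for j in range(3))   # S is symmetric
```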

As far as I can tell from these sources, a rank-$k$ correction of a matrix $A$ consists of adding a matrix $B$ of rank $k$ to $A$.
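If my reading is right, the rank of $B$ is what matters: a rank-$k$ matrix can be written as $B = UV^T$ with $U, V$ of size $n\times k$, which is the form the Woodbury identity exploits. Here is a check of the "rank at most $k$" part on a toy example of my own, a $4\times 4$ matrix built from two-column factors, verified by showing that every $3\times 3$ minor vanishes:

```python
# A matrix of the form B = U V^T, with U and V of size 4x2, has rank <= 2,
# so every 3x3 minor of B must have zero determinant.
from itertools import combinations

def det3(M):
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Toy 4x2 factors (arbitrary values).
U = [[1.0, 0.0], [2.0, 1.0], [0.0, 3.0], [1.0, 1.0]]
V = [[1.0, 2.0], [0.0, 1.0], [3.0, 0.0], [1.0, 1.0]]

# B = U V^T: each entry is the dot product of a row of U with a row of V.
B = [[sum(U[i][r] * V[j][r] for r in range(2)) for j in range(4)]
     for i in range(4)]

# Every 3x3 minor of a rank-2 matrix vanishes.
for rows in combinations(range(4), 3):
    for cols in combinations(range(4), 3):
        minor = [[B[i][j] for j in cols] for i in rows]
        assert abs(det3(minor)) < 1e-9
```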

However, it is unclear to me why matrix addition is described in this particular way. Can someone explain, at a basic level, the implications and applications of this terminology, as well as its importance in optimization problems?