I have two matrices $A$ and $B$, both symmetric and positive semi-definite (in fact, they are covariance matrices, and I assume they are invertible), and I am interested in computing the eigenvectors of the matrix $A^{-1}-B^{-1}$.
From numerical simulations, it appears that the eigenvectors of $A^{-1}-B^{-1}$ are often well approximated simply by the eigenvectors of $A-B$.
Moreover, if I regularize the matrices $A$ and $B$ before inverting them, by computing the matrix $(A+k\cdot I)^{-1} - (B+k\cdot I)^{-1}$ (where $k$ is a positive constant, and $I$ is the identity matrix), the eigenvectors of $A-B$ provide an almost perfect approximation, especially when $k$ is large.
Is there a way to prove that the eigenvectors of $A-B$ are a good approximation of the eigenvectors of $A^{-1}-B^{-1}$?
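For reference, my numerical tests look roughly like the following minimal NumPy sketch (the matrix size, seed, and jitter constant are arbitrary choices for illustration); it measures how well each eigenvector of $A^{-1}-B^{-1}$ aligns with some eigenvector of $A-B$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3  # small matrices, similar to my original tests


def random_covariance(n, rng):
    # Random symmetric positive definite matrix (a valid covariance matrix).
    M = rng.standard_normal((n, n))
    return M @ M.T + 1e-3 * np.eye(n)  # small jitter guarantees invertibility


A = random_covariance(n, rng)
B = random_covariance(n, rng)

_, V1 = np.linalg.eigh(A - B)                                # eigenvectors of A - B
_, V2 = np.linalg.eigh(np.linalg.inv(A) - np.linalg.inv(B))  # eigenvectors of A^-1 - B^-1

# For each eigenvector of A^-1 - B^-1, find the best-aligned eigenvector of
# A - B; |cos(angle)| = 1 means the two directions coincide (up to sign).
alignment = np.abs(V1.T @ V2).max(axis=0)
print(alignment)
```

Matching by the largest $|\cos|$ over all columns avoids having to guess how the eigenvalue orderings of the two matrices correspond.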
Edit: in light of your (very helpful) comments, I expanded my numerical tests to larger matrices, where I can see that it is not generally true that the eigenvectors of $A-B$ are close to those of $A^{-1}-B^{-1}$.
However, this still works if I consider the "regularized" versions of $A$ and $B$: the eigenvectors of $A-B$ closely match the eigenvectors of $(A+k\cdot I)^{-1} - (B+k\cdot I)^{-1}$ (where $k$ is a large positive constant, and $I$ is the identity matrix). Is there any way to prove this?
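To make the regularized observation concrete, here is a sketch along the same lines for larger matrices (the size $n$, seed, and the value of $k$ below are my arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50    # larger matrices, where the unregularized version fails for me
k = 1e6   # large regularization constant
I = np.eye(n)

M1 = rng.standard_normal((n, n))
M2 = rng.standard_normal((n, n))
A, B = M1 @ M1.T, M2 @ M2.T  # random covariance-like (PSD) matrices

_, V1 = np.linalg.eigh(A - B)
_, V2 = np.linalg.eigh(np.linalg.inv(A + k * I) - np.linalg.inv(B + k * I))

# Best |cos(angle)| match between each eigenvector of the regularized
# difference and the eigenvectors of A - B; matching over all columns
# absorbs both sign flips and any reordering of the eigenvalues.
best_match = np.abs(V1.T @ V2).max(axis=0)
print(best_match.min())  # close to 1 when every eigenvector has a match
```

With smaller $k$ the worst-case alignment degrades, consistent with the approximation being a large-$k$ effect.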
Edit 2: I think I might have found the proof for this. There is a well-known result (the Neumann series) that $(I-M)^{-1}= I+M+M^2+M^3+\dots$ whenever the series converges, e.g. when all eigenvalues of $M$ are smaller than $1$ in absolute value (this can be checked by multiplying both sides by $I-M$). Writing $\epsilon = 1/k$, I can factor $(A+k\cdot I)^{-1} = \epsilon \cdot (I + \epsilon A)^{-1}$, and for large $k$ (i.e. small $\epsilon$) the Neumann series with $M = -\epsilon A$ gives $(I+\epsilon A)^{-1} \approx I - \epsilon A$, because I can drop the higher-order terms.
Finally I have:
$(A+k\cdot I)^{-1} - (B+k\cdot I)^{-1} = \epsilon\,(I+\epsilon A)^{-1} - \epsilon\,(I+\epsilon B)^{-1} \approx \epsilon(I-\epsilon A) - \epsilon(I-\epsilon B) = \epsilon^2 (B-A)$
Since $\epsilon^2(B-A)$ is just a negative scalar multiple of $A-B$, it has exactly the same eigenvectors (with the order of the eigenvalues reversed), which explains the numerical observation.
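A quick numerical sanity check of the first-order approximation $(A+k\cdot I)^{-1} - (B+k\cdot I)^{-1} \approx \epsilon^2(B-A)$ with $\epsilon = 1/k$ (keeping track of the prefactors, an overall $\epsilon^2$ appears, which rescales the eigenvalues but does not affect the eigenvectors); sizes, seed, and $k$ below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
M1 = rng.standard_normal((n, n))
M2 = rng.standard_normal((n, n))
A, B = M1 @ M1.T, M2 @ M2.T  # random PSD (covariance-like) matrices

k = 1e5
eps = 1.0 / k
I = np.eye(n)

exact = np.linalg.inv(A + k * I) - np.linalg.inv(B + k * I)
approx = eps**2 * (B - A)  # first-order Neumann approximation

# Relative error of the approximation; it shrinks roughly like eps * ||A||
# as k grows, since the dropped terms are O(eps^3).
rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
print(rel_err)
```
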
Please let me know if you have any other insights about this, and thanks again for your help!