I would like to generalize this question to a diagonal matrix that receives two rank-one updates, both of a special form. In the previous question I asked why most of the eigenvalues of an $n\times n$ diagonal matrix $D$ remain the same when the matrix is perturbed by a rank-one update $bk^T$, where $b$ and $k$ are $n$-dimensional vectors: $$D+bk^T$$ The eigenvalues of $D$ are obviously its diagonal entries, and numerical evaluation showed that many of them remain the same even after adding the perturbation. The answer to that question gave a proof using Sylvester's determinant theorem, showing that if a diagonal matrix has an eigenvalue of multiplicity $k > 1$, then after a rank-one update that same eigenvalue survives with multiplicity at least $k-1$.

Instead of a diagonal matrix with one rank-one update, I wondered whether a similar proof can be given for matrices that receive two rank-one updates, i.e. $$D + bk^T + uv^T$$ where $D$ is an $n\times n$ diagonal matrix and $b$, $k$, $u$, $v$ are $n$-dimensional vectors. Instead of arbitrary vectors as in the previous question, I take $u$ and $k$ to be of the form $$ (*,*,0,0,0,\cdots)$$ and $v$ and $b$ to be of the form $$ (0,0,*,*,*,\cdots) $$ The resulting matrix looks like $$ \begin{pmatrix} * & 0 & *&* & * & * & \cdots\\ 0 & * & *&* & * & * & \cdots\\ * & * & *&0 & 0 & 0 & \cdots\\ * & * & 0&* & 0 & 0 & \cdots\\ * & * & 0&0 & * & 0 & \cdots\\ * & * & 0&0 & 0 & * & \cdots\\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$ where the diagonal elements come from $D$, and the nonzero entries in the first two rows and first two columns come from the rank-one updates $uv^T$ and $bk^T$ respectively.
My numerical calculations show that in this case, too, at least some copies of the eigenvalues of higher multiplicity remain unchanged. I tried to repeat the proof from the previous question with two rank-one perturbations, but I did not succeed. Can somebody help me out?
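For concreteness, here is a small NumPy sketch of the kind of experiment I ran (the particular diagonal entries and the random seed are arbitrary choices for illustration). It builds a $D$ with the eigenvalue $2$ of multiplicity $5$, applies the two structured rank-one updates, and counts how many copies of $2$ survive:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Diagonal matrix whose eigenvalue 2 has multiplicity 5 (indices 2..6).
D = np.diag([1.0, 3.0, 2.0, 2.0, 2.0, 2.0, 2.0, 4.0])

# Structured update vectors: u, k are nonzero only in the first two entries,
# v, b are nonzero only from the third entry on.
u = np.zeros(n); u[:2] = rng.standard_normal(2)
k = np.zeros(n); k[:2] = rng.standard_normal(2)
v = np.zeros(n); v[2:] = rng.standard_normal(n - 2)
b = np.zeros(n); b[2:] = rng.standard_normal(n - 2)

A = D + np.outer(b, k) + np.outer(u, v)
eig = np.sort_complex(np.linalg.eigvals(A))

# Count how many eigenvalues are numerically still equal to 2.
surviving = int(np.sum(np.isclose(eig, 2.0)))
print(eig)
print("copies of the eigenvalue 2 that survive:", surviving)
```

With these structured vectors, any $w$ supported on indices $2,\dots,6$ with $v^T w = 0$ satisfies $(D + bk^T + uv^T)w = 2w$ (both $k^T w$ and $v^T w$ vanish), so at least $4$ of the $5$ copies of the eigenvalue $2$ survive, which is what the count reports.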
Edit: It would also be nice to have an analytical understanding of why, numerically, the first two eigenvalues of $D$ are affected the most by this perturbation.
Let's ignore the fact that the original matrix is diagonal.
Suppose that we have a matrix $A$ with an eigenvalue $\lambda$ of geometric multiplicity $k$. Then we have the property that $$ (A-\lambda I)v=0 $$ where $v$ can be any linear combination of eigenvectors $v_1,\dots,v_k$, each of which satisfies this equation, and which we may take to be orthogonal to each other.
If we perform a rank-1 update to $A$, then we now have $$ (A+bc^T-\lambda I)(v+w)=0 $$ for some $w$. Expanding this second equation and substituting in our first equation, we get $$ (A+bc^T-\lambda I)w +[c^T v]b=0 $$ where $c^T v$ is a scalar. If $c^T v=0$, then $w=0$ solves this equation and the eigenvector is unaltered. Note, however, that $v = \sum_i \alpha_i v_i$, and so $c^T v = \sum_i \alpha_i [c^T v_i]$.
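This step is easy to check numerically. A minimal sketch (a symmetric $A$ is used here only so that the eigenpairs are real; the matrix and vectors are arbitrary random choices): pick an eigenpair $(\lambda, x)$ of $A$, project $c$ so that $c^T x = 0$, and verify that $x$ is still an eigenvector of $A + bc^T$ with the same eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
M = rng.standard_normal((n, n))
A = M + M.T                          # symmetric, so the eigenpairs are real

lam_all, V = np.linalg.eigh(A)
lam, x = lam_all[0], V[:, 0]         # one eigenpair: A @ x == lam * x, x unit

b = rng.standard_normal(n)
c = rng.standard_normal(n)
c -= (c @ x) * x                     # project out x so that c^T x = 0

A1 = A + np.outer(b, c)              # rank-1 update
# x is still an eigenvector of the updated matrix, with the same eigenvalue:
print(np.allclose(A1 @ x, lam * x))  # True
```

Since $c^T x = 0$, the update $bc^T$ annihilates $x$, so $(A + bc^T)x = Ax = \lambda x$ exactly as the equations above predict.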
But $c^T v = 0$ is a single linear constraint on the $k$ coefficients $\alpha_i$, so you have $k-1$ degrees of freedom with which to satisfy it: a subspace of eigenvectors of dimension at least $k-1$ survives, which means that, after the rank-1 update, the eigenvalue $\lambda$ has multiplicity at least $k-1$.
But this argument did not rely on $A$ being diagonal, which means that after one rank-1 update we can perform another — that is, we can apply a rank-1 update to $A+bc^T$. The result is that the eigenvalue of multiplicity $k-1$ becomes an eigenvalue of multiplicity at least $k-2$.
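A quick numerical check of the whole argument (the matrix, the repeated eigenvalue $5$, and the update vectors below are arbitrary choices of mine): start from a non-diagonal $A$ with an eigenvalue of multiplicity $4$ and watch the multiplicity drop by one with each generic rank-1 update:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 7

# Non-diagonal A with eigenvalue 5 of multiplicity 4, built by a
# similarity transform of a diagonal matrix.
S = rng.standard_normal((n, n))
A = S @ np.diag([5.0, 5.0, 5.0, 5.0, 1.0, 2.0, 3.0]) @ np.linalg.inv(S)

def mult(M, lam, tol=1e-6):
    """Count eigenvalues of M numerically equal to lam."""
    return int(np.sum(np.abs(np.linalg.eigvals(M) - lam) < tol))

b, c = rng.standard_normal(n), rng.standard_normal(n)
u, v = rng.standard_normal(n), rng.standard_normal(n)

A1 = A + np.outer(b, c)              # first rank-1 update
A2 = A1 + np.outer(u, v)             # second rank-1 update

print(mult(A, 5.0), mult(A1, 5.0), mult(A2, 5.0))  # generically: 4 3 2
```

For generic (random) update vectors each update costs exactly one copy of the repeated eigenvalue, so the printed multiplicities should generically be 4, 3, 2.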