Background: Let $H, H_0$ be real square matrices. We want to estimate the eigenvalues/eigenvectors of $$H = H_0 + \epsilon R,$$ where $0 \leq \epsilon \leq 1$ and the eigenvalues/eigenvectors of $H_0$ are assumed to be known.
Let $\lambda_0$, $v_0$ be an eigenvalue/eigenvector pair of $H_0$: $H_0 v_0 = \lambda_0 v_0$.
For small $\epsilon$, we can assume that the perturbed eigenvalue/eigenvector pair satisfies $Hv = \lambda v$, with $\lambda \rightarrow \lambda_0$ and $v \rightarrow v_0$ as $\epsilon \rightarrow 0$, and expand $$\lambda = \lambda_0 + \epsilon \lambda_1 + \epsilon^2 \lambda_2 + \epsilon^3 \lambda_3 + \dots$$ $$v = v_0 + \epsilon v_1 + \epsilon^2 v_2 + \epsilon^3 v_3 + \dots$$ where $\lambda_1, \lambda_2, \lambda_3, \dots$ and $v_1, v_2, v_3, \dots$ are the values/vectors that we want to calculate. Substituting these expansions into $Hv = \lambda v$, we get $$(H_0 + \epsilon R)(v_0 + \epsilon v_1 + \epsilon^2 v_2 + \dots) = (\lambda_0 + \epsilon \lambda_1 + \epsilon^2 \lambda_2 + \dots)(v_0 + \epsilon v_1 + \epsilon^2 v_2 + \dots).$$
Collecting the first three orders in $\epsilon$ gives $$(H_0 - \lambda_0)v_0 = 0$$ $$(H_0 - \lambda_0)v_1 = \lambda_1 v_0 - Rv_0$$ $$(H_0 - \lambda_0)v_2 = \lambda_2 v_0 + \lambda_1 v_1 - Rv_1$$ $$\dots$$
Also, writing $(\cdot,\cdot)$ for the scalar product, we have
$$\lambda_1 = \frac{(w, Rv_0) }{(w, v_0)}$$
$$\lambda_2 =\frac{(w, Rv_1 - \lambda_1 v_1) }{(w, v_0)}$$
$$...$$
where $w$ satisfies $(H_0^T - \bar{\lambda}_0)w = 0$, i.e. $w$ is a left eigenvector of $H_0$ associated with $\lambda_0$.
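As a sanity check, here is a short NumPy sketch of the formulas above on a hypothetical $3\times 3$ example (the matrices $H_0$ and $R$ are made up for illustration). Since $H_0$ is symmetric here, we can take $w = v_0$, and $v_1$ is obtained from the order-$\epsilon$ equation by a pseudoinverse (which inverts $H_0 - \lambda_0$ on the complement of $v_0$):

```python
import numpy as np

# Hypothetical test data: symmetric H0, diagonal R, small eps.
H0 = np.array([[2.0, 1.0, 0.0],
               [1.0, 3.0, 1.0],
               [0.0, 1.0, 4.0]])
R = np.diag([0.1, -0.2, 0.3])
eps = 1e-3

# Unperturbed pair (smallest eigenvalue, non-degenerate here).
evals, evecs = np.linalg.eigh(H0)
lam0, v0 = evals[0], evecs[:, 0]

# H0 symmetric => the left eigenvector w equals v0 and (w, v0) = 1.
lam1 = v0 @ R @ v0                       # lam1 = (w, R v0) / (w, v0)

# Solve (H0 - lam0) v1 = lam1 v0 - R v0 via the pseudoinverse,
# which picks the solution orthogonal to v0.
v1 = np.linalg.pinv(H0 - lam0 * np.eye(3)) @ (lam1 * v0 - R @ v0)
lam2 = v0 @ (R @ v1 - lam1 * v1)         # lam2 = (w, R v1 - lam1 v1) / (w, v0)

exact = np.linalg.eigvalsh(H0 + eps * R)[0]
err1 = abs(exact - (lam0 + eps * lam1))                   # O(eps^2)
err2 = abs(exact - (lam0 + eps * lam1 + eps**2 * lam2))   # O(eps^3)
print(err1, err2)
```

The first-order error shrinks like $\epsilon^2$ and the second-order error like $\epsilon^3$, as the expansion predicts.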
Question: Now take $\epsilon = 1$, with $H$ and $H_0$ symmetric matrices and $R$ a diagonal matrix.
In this case, can we say that $$\lambda_i \rightarrow 0 \text{ as } i \rightarrow \infty\,?$$
I have observed this numerically for many different symmetric matrices $H$ and $H_0$ (with $R$ diagonal and $\epsilon = 1$), but I do not know how to prove it.
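For what it's worth, the experiment can be reproduced with a short NumPy sketch (hypothetical matrices, chosen so that $R$ is small relative to the spectral gaps of $H_0$). The order-$k$ equation is $(H_0 - \lambda_0)v_k = \sum_{j=1}^{k} \lambda_j v_{k-j} - Rv_{k-1}$; projecting onto $v_0$ fixes $\lambda_k$, and the pseudoinverse gives $v_k$ orthogonal to $v_0$:

```python
import numpy as np

# Hypothetical data: symmetric H0, diagonal R (eps = 1 implicitly).
H0 = np.array([[2.0, 1.0, 0.0],
               [1.0, 3.0, 1.0],
               [0.0, 1.0, 4.0]])
R = np.diag([0.1, -0.2, 0.3])
n = H0.shape[0]

evals, evecs = np.linalg.eigh(H0)
lam0, v0 = evals[0], evecs[:, 0]
Ginv = np.linalg.pinv(H0 - lam0 * np.eye(n))   # inverts on the complement of v0

# Order-k equation: (H0 - lam0) v_k = sum_{j=1}^{k} lam_j v_{k-j} - R v_{k-1}.
# Projecting the right-hand side onto v0 must give zero (solvability),
# which determines lam_k; v_k is then chosen orthogonal to v0.
lams, vs = [lam0], [v0]
for k in range(1, 13):
    partial = sum(lams[j] * vs[k - j] for j in range(1, k)) - R @ vs[k - 1]
    lam_k = -(v0 @ partial)              # solvability condition
    lams.append(lam_k)
    vs.append(Ginv @ (partial + lam_k * v0))

print([f"{l:.2e}" for l in lams[1:]])
```

In examples like this one, the terms $\lambda_i$ shrink roughly geometrically, and $\sum_i \lambda_i$ lands on an eigenvalue of $H_0 + R$; the catch is that this is exactly the regime where the perturbation series converges, and $\lambda_i \to 0$ is then automatic. The interesting part of the question is whether it holds without such a smallness assumption.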
P.S. Maybe the answer is pretty basic; I am not doing research in eigenvalue perturbation theory, but this situation arises in my work.
Any idea would be useful!