Consider the diagonalization of the following matrix as a series in $0 < \epsilon \ll 1$: $$ A = A_0 + \epsilon A_1 $$ where $$ A_0 = \frac{3}{2} \ \left[ \begin{matrix} 1 & 1 & 0 \\ 1 & 1 & 0 \\ -1 & -1 & 0 \\ \end{matrix} \right] \qquad \& \qquad A_1 = \left[ \begin{matrix} 2 & 0 & 1 \\ 3 & 0 & 1 \\ 1 & 0 & 0 \\ \end{matrix} \right] \ . $$
Suppose I am interested in finding the eigenvalues and eigenvectors to $\mathcal{O}(\epsilon^1)$. The characteristic polynomial for the eigenvalues $\lambda$ is $$ \lambda^3 - ( 3 + 2 \epsilon ) \lambda^2 + \frac{\epsilon}{2}(3 - 2 \epsilon) \lambda + \frac{3}{2} \epsilon^2 = 0 \ . $$ Expanding the eigenvalues as series in $\epsilon$, I get $$ \lambda_1 = 3 + \frac{3}{2} \epsilon + \mathcal{O}(\epsilon^2) \\ \lambda_2 = 0 - \frac{1}{2} \epsilon + \mathcal{O}(\epsilon^2) \\ \lambda_3 = 0 + \epsilon + \mathcal{O}(\epsilon^2) $$ where $\lambda_2$ and $\lambda_3$ are degenerate when the perturbation is turned off ($\epsilon = 0$), so the perturbation "lifts" the degeneracy.
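As a sanity check (a small numerical sketch with NumPy, not part of the derivation; the value of $\epsilon$ is an arbitrary illustrative choice), one can compare the exact eigenvalues of $A$ at small $\epsilon$ against the first-order series above:

```python
import numpy as np

eps = 1e-4  # small perturbation parameter (illustrative choice)

A0 = 1.5 * np.array([[ 1.0,  1.0, 0.0],
                     [ 1.0,  1.0, 0.0],
                     [-1.0, -1.0, 0.0]])
A1 = np.array([[2.0, 0.0, 1.0],
               [3.0, 0.0, 1.0],
               [1.0, 0.0, 0.0]])
A = A0 + eps * A1

# A is not symmetric, so use the general eigensolver
lam = np.sort(np.linalg.eigvals(A).real)

# First-order series predictions, sorted the same way
series = np.sort(np.array([3 + 1.5 * eps, -0.5 * eps, eps]))

# The mismatch should shrink like eps**2
print(np.max(np.abs(lam - series)))
```

The maximum mismatch comes out of order $\epsilon^2$, consistent with the truncation error of the series.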
From there I take the eigenvalues $\lambda_i$ and the matrix $A$ and solve for the eigenvectors $\vec{v}_i$ using $$ A \vec{v}_i = \lambda_i \vec{v}_{i} \ , $$ and I find, for example, that the eigenvector for $\lambda_2$ is $$ \vec{v}_2 = \left[ \begin{matrix} -3/7 \\ 3/7 \\ 1 \end{matrix} \right] + \epsilon \left[ \begin{matrix} a \\ \frac{1}{21} - a \\ 0 \end{matrix} \right] + \mathcal{O}(\epsilon^2) $$ where $a$ can be any constant. I computed this by matching powers of $\epsilon$ in $A \vec{v}_i = \lambda_i \vec{v}_{i}$ order by order, and by normalizing so that the bottom component of the eigenvector is exactly $1$ (which forces the third component of the $\mathcal{O}(\epsilon)$ correction to vanish).
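I also checked this numerically (a sketch in NumPy; the particular values of $\epsilon$ and of the constant $a$ are arbitrary choices): for several different values of $a$, the residual of the eigenvalue equation is $\mathcal{O}(\epsilon^2)$, so any $a$ works at first order.

```python
import numpy as np

eps = 1e-4  # small perturbation parameter (illustrative choice)

A0 = 1.5 * np.array([[ 1.0,  1.0, 0.0],
                     [ 1.0,  1.0, 0.0],
                     [-1.0, -1.0, 0.0]])
A1 = np.array([[2.0, 0.0, 1.0],
               [3.0, 0.0, 1.0],
               [1.0, 0.0, 0.0]])
A = A0 + eps * A1

lam2 = -0.5 * eps                # eigenvalue lambda_2 to first order
v0 = np.array([-3/7, 3/7, 1.0])  # zeroth-order eigenvector

residuals = []
for a in (0.0, 1.0, -2.5):       # arbitrary values of the free constant
    v1 = np.array([a, 1/21 - a, 0.0])
    v = v0 + eps * v1
    residuals.append(np.max(np.abs(A @ v - lam2 * v)))

# Every residual is O(eps**2), independent of a
print(residuals)
```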
My question: I find that the eigenvalue equation $A \vec{v}_i = \lambda_i \vec{v}_{i}$ is satisfied at order $\epsilon^1$ for any constant $a$. What puzzles me is that $a$ remains completely undetermined if we work to this order in perturbation theory. Why is this the case?
And is there a way to determine the constant $a$ without working to higher order in perturbation theory?