Suppose we want to approximate the eigenvalue problem $$\left(A+\varepsilon B\right)x=\lambda x,$$ where $$ A=\begin{bmatrix}1&1&0\\0&1&0\\0&0&-1\end{bmatrix},\qquad B=\begin{bmatrix}1&0&0\\1&-1&0\\0&0&0\end{bmatrix}. $$
I solved the problem exactly and then expanded the eigenvalues in $\varepsilon$; to leading order they are $$\lambda=-1,\ 1\pm\sqrt{\varepsilon}.$$
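(A quick numerical sanity check of this, sketched with NumPy for one sample value of $\varepsilon$; the exact pair comes from the quadratic formula on the upper-left $2\times 2$ block:)

```python
import numpy as np

# The matrices from the problem statement.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, -1.0]])
B = np.array([[1.0, 0.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 0.0]])

eps = 1e-4
lam = np.sort(np.linalg.eigvals(A + eps * B).real)

# Exact eigenvalues: -1 and 1 +/- sqrt(eps*(1+eps)) (characteristic
# polynomial of the upper-left 2x2 block); expanding the square root
# gives 1 +/- sqrt(eps) + O(eps**1.5).
s = np.sqrt(eps * (1 + eps))
exact = np.sort([-1.0, 1.0 - s, 1.0 + s])
print(lam, exact)
```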
Now, if I look formally for an expansion of the form $$\lambda=\sum_{n=0}^{\infty}\varepsilon^{\frac{n}{2}}\lambda_n, \quad x=\sum_{n=0}^{\infty}\varepsilon^{\frac{n}{2}}x_n,$$ and then match at $\mathcal{O}\left(\varepsilon^{\frac{1}{2}}\right),$ I find $$Ax_1-\lambda_0 x_1=\lambda_1x_0.$$ Multiplying on the left by $y_0^{\dagger}$, the left eigenvector of $A$ for $\lambda_0$, gives $$\lambda_1 y_0^{\dagger}x_0=0.$$ But for $\lambda_0=1$ the eigenvalue is defective ($A$ has a Jordan block there), so $y_0^{\dagger}x_0=0$ identically. This doesn't give me a condition on $\lambda_1$, and I don't think matching to higher order does any good. What am I missing here?
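(Concretely: for $\lambda_0=1$ the right eigenvector of $A$ is $e_1$ and the left eigenvector is $e_2$, so the solvability condition is vacuous. A small NumPy check:)

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, -1.0]])

x0 = np.array([1.0, 0.0, 0.0])  # right eigenvector of A for lambda = 1
y0 = np.array([0.0, 1.0, 0.0])  # left eigenvector of A for lambda = 1

assert np.allclose(A @ x0, x0)  # A x0 = x0
assert np.allclose(y0 @ A, y0)  # y0^T A = y0^T
print(y0 @ x0)                  # 0: lambda_1 * (y0^T x0) = 0 is empty
```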
The problem with your approach is that you assume you can expand $x$ as
$$x=\sum_{n=0}^{\infty}\varepsilon^{\frac{n}{2}}x_{n}$$
If you compute the exact eigenvectors (e.g. with Wolfram Alpha), you can see that they contain the expression $\frac{\sqrt{\varepsilon\left(\varepsilon+1\right)}}{\varepsilon}=\frac{\sqrt{1+\varepsilon}}{\sqrt{\varepsilon}}$. Expanded as a series, this has a leading term $\varepsilon^{-\frac{1}{2}}$, which doesn't exist in your series representation of $x$. Thus, you should take instead
$$x=\sum_{\color{red}{n=-1}}^{\infty}\varepsilon^{\frac{n}{2}}x_{n}$$
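(The leading $\varepsilon^{-\frac{1}{2}}$ behaviour can be verified symbolically; a sketch with SymPy:)

```python
import sympy as sp

eps = sp.symbols('epsilon', positive=True)
# The expression appearing in the exact eigenvector; it equals
# sqrt(1 + eps) / sqrt(eps), so its expansion starts at eps**(-1/2).
expr = sp.sqrt(eps * (eps + 1)) / eps
print(sp.series(expr, eps, 0, 2))
```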
Now you get the equation
$$\left(A+\varepsilon B\right)x=\lambda x$$
$$\Downarrow\\\left(A+\varepsilon B\right)\left(\varepsilon^{-\frac{1}{2}}x_{-1}+x_{0}+\varepsilon^{\frac{1}{2}}x_{1}+\dots\right)=\left(\lambda_{0}+\varepsilon^{\frac{1}{2}}\lambda_{1}+\varepsilon\lambda_{2}+\dots\right) \left(\varepsilon^{-\frac{1}{2}}x_{-1}+x_{0}+\varepsilon^{\frac{1}{2}}x_{1}+\dots\right)$$
or, order by order in $\varepsilon^{\frac{1}{2}}$,
$$\begin{array}{c}\left(1\right)\\\left(2\right)\\\left(3\right)\end{array}\begin{cases}Ax_{-1}=\lambda_{0}x_{-1},&\mathcal{O}\left(\varepsilon^{-\frac{1}{2}}\right)\\ Ax_{0}=\lambda_{0}x_{0}+\lambda_{1}x_{-1},&\mathcal{O}\left(1\right)\\ Ax_{1}+Bx_{-1}=\lambda_{0}x_{1}+\lambda_{1}x_{0}+\lambda_{2}x_{-1},&\mathcal{O}\left(\varepsilon^{\frac{1}{2}}\right)\end{cases}$$
Taking $\lambda_{0}=1$, you can easily verify from equation $\left(1\right)$ that the corresponding eigenvector is
$$x_{-1}=\begin{pmatrix}1\\0\\0\end{pmatrix}$$
Now, multiplying equation $\left(2\right)$ from the left by $e_{1}^{T}$, $e_{2}^{T}$ and $e_{3}^{T}$ in turn, we have
$$\begin{cases}\left(e_{1}^{T}+e_{2}^{T}\right)x_{0}=1\cdot e_{1}^{T}x_{0}+\lambda_{1}\\e_{2}^{T}x_{0}=1\cdot e_{2}^{T}x_{0}+0\\-e_{3}^{T}x_{0}=1\cdot e_{3}^{T}x_{0}+0\end{cases}$$
$$\Downarrow\\\begin{cases}e_{2}^{T}x_{0}=\lambda_{1}\\e_{3}^{T}x_{0}=0\end{cases}$$
Thus
$$x_{0}=\begin{pmatrix}\ast\\\lambda_{1}\\0\end{pmatrix}$$
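(The row-vector identities used in these manipulations, $e_{1}^{T}A=e_{1}^{T}+e_{2}^{T}$, $e_{2}^{T}A=e_{2}^{T}$, $e_{3}^{T}A=-e_{3}^{T}$, $e_{2}^{T}B=e_{1}^{T}-e_{2}^{T}$, are just the rows of $A$ and $B$; a direct check:)

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, -1.0]])
B = np.array([[1.0, 0.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 0.0]])
e1, e2, e3 = np.eye(3)  # standard basis row vectors

assert np.allclose(e1 @ A, e1 + e2)  # e1^T A = e1^T + e2^T
assert np.allclose(e2 @ A, e2)       # e2^T A = e2^T
assert np.allclose(e3 @ A, -e3)      # e3^T A = -e3^T
assert np.allclose(e2 @ B, e1 - e2)  # e2^T B = e1^T - e2^T
print("row identities hold")
```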
Now let's do the same with equation $\left(3\right)$, multiplying by $e_{2}^{T}$:
$$e_{2}^{T}Ax_{1}+e_{2}^{T}Bx_{-1}=\lambda_{0}e_{2}^{T}x_{1}+\lambda_{1}e_{2}^{T}x_{0}+\lambda_{2}e_{2}^{T}x_{-1}$$
$$e_{2}^{T}x_{1}+\left(e_{1}^{T}-e_{2}^{T}\right)x_{-1}=e_{2}^{T}x_{1}+\lambda^{2}_{1}$$
$$1=\lambda^{2}_{1}$$
So $\lambda_{1}=\pm 1$ and you get
$$\lambda\approx\lambda_{0}+\lambda_{1}\sqrt{\varepsilon}=1\pm\sqrt{\varepsilon}$$
as desired.
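(As a final sanity check, the two-term expansion $1\pm\sqrt{\varepsilon}$ tracks the exact eigenvalues with an $\mathcal{O}\left(\varepsilon^{\frac{3}{2}}\right)$ error, which a NumPy sketch confirms numerically:)

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, -1.0]])
B = np.array([[1.0, 0.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 0.0]])

for eps in [1e-2, 1e-4, 1e-6]:
    lam = np.sort(np.linalg.eigvals(A + eps * B).real)
    approx = np.sort([-1.0, 1 - np.sqrt(eps), 1 + np.sqrt(eps)])
    err = np.max(np.abs(lam - approx))
    # err shrinks roughly like eps**1.5 as eps decreases
    print(eps, err)
```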