$$\text{det}\begin{bmatrix}\lambda-(1-a) & -a \\ -b & \lambda - (1-b)\end{bmatrix}=0$$
$$\Big(\lambda-(1-a)\Big)~\Big(\lambda - (1-b)\Big) -(-a)(-b) = 0$$
$$\Big(\lambda-(1-a)\Big)~\Big(\lambda - (1-b)\Big) -ab = 0$$
The textbook says the answer is:
$$(\lambda -1)(\lambda -1 +a +b)=0$$
How did they get this? Every time I try this problem, I end up with about two pages of multiplying terms out into polynomials and applying the quadratic formula... there's got to be a faster 2-3 line way to factor this...
Performing the column operation $C_1\mapsto C_1+C_2$ (adding one column to another leaves the determinant unchanged), we get $$\left| \begin{matrix} \lambda -1 & -a \\ \lambda-1 & \lambda-(1-b) \end{matrix}\right|=0 \\ \implies (\lambda-1) \left| \begin{matrix} 1& -a \\ 1 & \lambda-(1-b) \end{matrix} \right|=0 $$ where in the second step the common factor $\lambda-1$ is pulled out of the first column. Now try expanding the remaining $2\times 2$ determinant.
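If you want to sanity-check the factorization symbolically, a quick SymPy sketch (the symbol names `a`, `b`, `lam` are just stand-ins for the ones in the problem) confirms that the characteristic polynomial factors as claimed:

```python
import sympy as sp

a, b, lam = sp.symbols('a b lambda')

# The matrix from the question: lambda*I - A for the 2x2 system.
M = sp.Matrix([[lam - (1 - a), -a],
               [-b,            lam - (1 - b)]])

char_poly = M.det()                 # (lambda-(1-a))(lambda-(1-b)) - ab
factored = sp.factor(char_poly)
print(factored)

# Check it agrees with the textbook form (lambda - 1)(lambda - 1 + a + b).
assert sp.expand(factored - (lam - 1)*(lam - 1 + a + b)) == 0
```

This is only a verification aid; the column-operation argument above is the actual 2-3 line derivation.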