I had to calculate the eigenvalues of the following matrix.
$$H=h\begin{pmatrix}A+\frac{1}{2}(B+C) & 0 & \frac{1}{2}(B-C) \\ 0 & B+C & 0 \\ \frac{1}{2}(B-C) & 0 & A+\frac{1}{2}(B+C)\end{pmatrix}$$
for that, I calculated the characteristic polynomial
$$ \operatorname{char}(\lambda)=\det(H-\lambda \operatorname{Id}_3) $$
which I computed the usual way, via Laplace expansion. The master solution is
$$(A+B-\lambda)(A+C-\lambda)(B+C-\lambda)=0$$
Now that's a nice polynomial, and I'm wondering if I'm missing something. My approach of calculating the determinant directly seemed far more complex. Did they just rewrite the polynomial in a nice form, or is there a trick that would give the solution more easily?
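For reference, here is a quick symbolic sanity check of the master solution (assuming SymPy is available; the overall factor $h$ is dropped since it only rescales every eigenvalue):

```python
# Sanity check (assumes SymPy); the overall factor h is dropped
# since it only rescales every eigenvalue.
import sympy as sp

A, B, C, lam = sp.symbols('A B C lambda')

H = sp.Matrix([
    [A + (B + C)/2, 0,     (B - C)/2],
    [0,             B + C, 0        ],
    [(B - C)/2,     0,     A + (B + C)/2],
])

# Characteristic polynomial det(H - lambda*I), compared against the
# factored master solution.
char = (H - lam * sp.eye(3)).det()
master = (A + B - lam) * (A + C - lam) * (B + C - lam)
print(sp.simplify(char - master))  # prints 0: the factorization holds
```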
The eigenvalue of $B+C$ follows because subtracting $(B+C)I$ yields a matrix with a row of zeroes (and hence determinant zero). To find the remaining two eigenvalues, we ignore the middle column and row (since this just adds a factor of $(B+C-\lambda)$ to our polynomial), and then observe that subtracting $(A+B)I$ or $(A+C)I$ again yields a matrix with zero determinant.
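Concretely, deleting the middle row and column leaves a symmetric $2\times 2$ block whose determinant factors as a difference of squares:
$$\det\begin{pmatrix}A+\frac{1}{2}(B+C)-\lambda & \frac{1}{2}(B-C) \\ \frac{1}{2}(B-C) & A+\frac{1}{2}(B+C)-\lambda\end{pmatrix}=\left(A+\tfrac{1}{2}(B+C)-\lambda\right)^2-\tfrac{1}{4}(B-C)^2=(A+B-\lambda)(A+C-\lambda).$$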