After playing around in SymPy, I found that if I express the transition matrix of a Markov process as the identity plus a secondary matrix $P'$, I get an interesting result: the coefficients of the characteristic polynomial of $P'$ appear to be the elementary symmetric polynomials of its off-diagonal entries.
$$ P = \begin{bmatrix} 1-(P_{12}+P_{13}+\dots) & P_{12} & P_{13} & \dots & P_{1n}\\ P_{21} & 1-(P_{21}+P_{23}+\dots) & P_{23} & \dots & P_{2n}\\ P_{31} & P_{32} & 1-(P_{31}+P_{32}+\dots) & \dots & P_{3n}\\ \vdots & \vdots & \vdots & \ddots & \vdots\\ P_{n1} & P_{n2} & P_{n3} & \dots & 1-(P_{n1}+P_{n2}+\dots) \end{bmatrix} = I+P' \\ \chi(P') = \lambda^{n}+\lambda^{n-1} e_{1}(P_{12},P_{13},\dots)+\lambda^{n-2} e_{2}(P_{12},P_{13},\dots)+\dots+\lambda\, e_{n-1}(P_{12},P_{13},\dots) $$
(Here each $e_k$ is taken over all the off-diagonal entries $P_{ij}$, $i \neq j$.)
This empirical result looks like it hasn't been discovered, or is at least obscure, and I'm having trouble finding confirmation of it. I've also been trying to work out how one would go about proving it. The only helpful fact I've found is that the coefficients of the characteristic polynomial are (up to sign) the sums of the principal minors. Using that fact I've been able to verify the first and second coefficients, but proving it for all of them is turning out to be a pain.
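For concreteness, the principal-minors fact I'm using is, as I understand it:
$$
\det(\lambda I - M) = \sum_{k=0}^{n} (-1)^{k} E_{k}(M)\,\lambda^{n-k},
$$
where $E_{k}(M)$ denotes the sum of the $k \times k$ principal minors of $M$ (with $E_0 = 1$). For $k=1$ this gives the $\lambda^{n-1}$ coefficient as $-\operatorname{tr}(P')$; since each diagonal entry of $P'$ is $P'_{ii} = -\sum_{j \neq i} P_{ij}$, that coefficient is exactly $e_1$ over all the off-diagonal entries, which is the first-coefficient case mentioned above.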
Any help is much appreciated.
Example for three states: $$ \chi(P') = \lambda^{3}+\lambda^{2}(P_{12}+P_{13}+P_{21}+P_{23}+P_{31}+P_{32})\\ +\lambda(P_{12}P_{13}+P_{12}P_{21}+P_{12}P_{23}+P_{12}P_{31}+P_{12}P_{32}\\ +P_{13}P_{21}+P_{13}P_{23}+P_{13}P_{31}+P_{13}P_{32}\\ +P_{21}P_{23}+P_{21}P_{31}+P_{21}P_{32}\\ +P_{23}P_{31}+P_{23}P_{32}\\ +P_{31}P_{32}) $$
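For reproducibility, here is a minimal SymPy sketch of the three-state computation (the symbol names are my own choice, not anything canonical):

```python
import sympy as sp

# Off-diagonal entries of a three-state transition matrix,
# p_ij = probability of moving from state i to state j.
p12, p13, p21, p23, p31, p32 = sp.symbols("p12 p13 p21 p23 p31 p32", positive=True)
lam = sp.Symbol("lambda")

# P' = P - I: each diagonal entry of P' is minus the sum of the
# off-diagonal entries in its row, so every row of P' sums to zero.
Pprime = sp.Matrix([
    [-(p12 + p13), p12,          p13         ],
    [p21,          -(p21 + p23), p23         ],
    [p31,          p32,          -(p31 + p32)],
])

# SymPy's charpoly computes det(lambda*I - M).
chi = Pprime.charpoly(lam)
print(sp.expand(chi.as_expr()))
```

Since the rows of $P'$ sum to zero, $\lambda = 0$ is always an eigenvalue, which is why the constant term vanishes and the polynomial stops at the $\lambda\, e_{n-1}$ term.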