The following (roughly) appears on the Adjugate matrix Wikipedia page:
If
$$ p(t)~{\stackrel {\text{def}}{=}}~\det(t\mathbf {I} -\mathbf {A} )=\sum _{i=0}^{n}p_{i}t^{i}$$
is the characteristic polynomial of the real $n$-by-$n$ matrix $\mathbf A$, then
$$ \operatorname{adj}(s\mathbf{I}-\mathbf{A}) = \mathrm{\Delta} p(s,\mathbf{A})$$
where
$$ \mathrm {\Delta } p(s,t)~=\sum _{j=0}^{n-1}\sum _{i=0}^{n-j-1}p_{i+j+1}s^{i}t^{j} $$
is the first divided difference of $p$.
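(For concreteness, in the $2\times 2$ case one has $p(t)=t^{2}-\operatorname{tr}(\mathbf A)\,t+\det(\mathbf A)$, i.e. $p_{2}=1$, $p_{1}=-\operatorname{tr}(\mathbf A)$, $p_{0}=\det(\mathbf A)$, and the double sum reduces to
$$\mathrm{\Delta}p(s,t)=p_{1}+p_{2}s+p_{2}t=s+t-\operatorname{tr}(\mathbf A),$$
so $\mathrm{\Delta}p(s,\mathbf A)=\bigl(s-\operatorname{tr}(\mathbf A)\bigr)\mathbf I+\mathbf A$, which indeed equals $\operatorname{adj}(s\mathbf I-\mathbf A)=\operatorname{tr}(s\mathbf I-\mathbf A)\,\mathbf I-(s\mathbf I-\mathbf A)$.)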
Can anyone prove this, or provide a reference, please?
I'm most interested in the case where $s=1$, since this then gives me a nice polynomial expansion of $\operatorname{adj}(\mathbf{I}-\mathbf{A})$ in terms of powers of $\mathbf{A}$, which is well-known already, I would guess. A proof or a reference for this special case would be welcome, too.
I am assuming we are working over the field of complex numbers.
We have $(s-t) \Delta p(s,t) = p(s) - p(t).$
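Indeed, substituting $k=i+j+1$ in the double sum and telescoping in $j$:
$$(s-t)\,\Delta p(s,t)=\sum _{k=1}^{n}p_{k}\sum _{j=0}^{k-1}\left(s^{k-j}t^{j}-s^{k-j-1}t^{j+1}\right)=\sum _{k=1}^{n}p_{k}\left(s^{k}-t^{k}\right)=p(s)-p(t).$$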
It is also easy to see that, for a scalar $s$ and a matrix $A$, we have $\Delta p(s,A) = \Delta p(sI,A)$ (where the polynomial on the right-hand side takes two matrix arguments), since $s^jA^k = (sI)^jA^k$ for all non-negative integers $j,k$.
So for any complex $s$, comparing coefficients gives $$ (sI - A) \Delta p(s,A) = (sI - A) \Delta p(sI,A) = p(sI) - p(A) = p(sI) = p(s)I, $$ since $p(A) = 0$ by the Cayley–Hamilton theorem and $p(sI) = p(s)I = \det(sI-A)\,I$.
If $s$ is such that $sI-A$ is invertible, then $\Delta p(s,A) = (sI-A)^{-1}p(s)I = \det(sI-A)\,(sI-A)^{-1} = \operatorname{adj}(sI-A).$
Comparing entrywise, this is an equality of $n^2$ polynomials in $s$. Two polynomials that agree at infinitely many points are identical, and $sI - A$ is invertible for all but finitely many $s$ (the eigenvalues of $A$), so the equality holds for all $s$.
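As a sanity check (not part of the proof), the identity is easy to verify numerically with NumPy. Here the adjugate is computed as $\operatorname{adj}(M) = \det(M)\,M^{-1}$, which assumes $sI - A$ is invertible for the randomly chosen $A$:

```python
import numpy as np

def first_divided_difference(p, s, A):
    """Evaluate Dp(s, A) = sum_{j=0}^{n-1} sum_{i=0}^{n-j-1} p[i+j+1] s^i A^j,
    where p[i] is the coefficient of t^i in p(t) = det(tI - A)."""
    n = A.shape[0]
    Dp = np.zeros_like(A, dtype=float)
    for j in range(n):
        for i in range(n - j):
            Dp += p[i + j + 1] * s**i * np.linalg.matrix_power(A, j)
    return Dp

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
s = 1.0

# np.poly(A) gives the coefficients of det(tI - A) in decreasing order;
# reversing makes p[i] the coefficient of t^i.
p = np.poly(A)[::-1]

M = s * np.eye(n) - A                      # assumed invertible here
adj = np.linalg.det(M) * np.linalg.inv(M)  # adj(M) = det(M) * M^{-1}

print(np.allclose(adj, first_divided_difference(p, s, A)))
```

The same check can be repeated for other values of $s$ away from the eigenvalues of $A$.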