Formula for adjugate of matrix: $\operatorname{adj}(s\mathbf{I}-\mathbf{A}) = \mathrm{\Delta} p(s,\mathbf{A})$


The following (roughly) is written in the Adjugate Matrix Wikipedia page:


If

$$ p(t)~{\stackrel {\text{def}}{=}}~\det(t\mathbf {I} -\mathbf {A} )=\sum _{i=0}^{n}p_{i}t^{i}$$

is the characteristic polynomial of the $n$-by-$n$ real matrix $\mathbf A$, then

$$ \operatorname{adj}(s\mathbf{I}-\mathbf{A}) = \mathrm{\Delta} p(s,\mathbf{A})$$

where

$$ \mathrm {\Delta } p(s,t)~=\sum _{j=0}^{n-1}\sum _{i=0}^{n-j-1}p_{i+j+1}s^{i}t^{j} $$

is the first divided difference of $p$.


Can anyone prove this or provide a reference, please?

I'm most interested in the case where $s=1$, since this then gives me a nice polynomial expansion of $\operatorname{adj}(\mathbf{I}-\mathbf{A})$ in terms of powers of $\mathbf{A}$, which is well-known already, I would guess. A proof or a reference for this special case would be welcome, too.
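For what it's worth, the $s=1$ case is easy to sanity-check numerically. A quick sketch with NumPy (the matrix size and random seed below are arbitrary choices for illustration):

```python
import numpy as np

# Numerical check of adj(I - A) = Δp(1, A) = Σ_j (Σ_i p_{i+j+1}) A^j.
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))

# np.poly gives the characteristic polynomial det(tI - A) with coefficients
# in descending order; reverse to get p_0, ..., p_n in ascending order.
p = np.poly(A)[::-1]

# Δp(1, A): the coefficient of A^j is p_{j+1} + p_{j+2} + ... + p_n.
Dp = sum(p[j + 1:].sum() * np.linalg.matrix_power(A, j) for j in range(n))

# adj(I - A) = det(I - A) (I - A)^{-1}, valid when I - A is invertible.
M = np.eye(n) - A
adj = np.linalg.det(M) * np.linalg.inv(M)
print(np.allclose(adj, Dp))  # True
```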


Best answer:

I am assuming we are working over the field of complex numbers.

We have $(s-t) \Delta p(s,t) = p(s) - p(t).$
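As an aside, this divided-difference identity is easy to check symbolically for generic coefficients. A SymPy sketch (the degree $n=5$ is an arbitrary choice):

```python
import sympy as sp

# Symbolic check of (s - t) Δp(s, t) = p(s) - p(t) with generic coefficients.
n = 5
s, t = sp.symbols('s t')
pc = sp.symbols(f'p0:{n + 1}')  # generic coefficients p_0, ..., p_n

p = lambda x: sum(pc[i] * x**i for i in range(n + 1))
Dp = sum(pc[i + j + 1] * s**i * t**j
         for j in range(n) for i in range(n - j))

print(sp.expand((s - t) * Dp - (p(s) - p(t))))  # 0
```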

It is also easy to see that, given a scalar $s$ and a matrix $A$, we have $\Delta p(s,A) = \Delta p(sI,A)$ (where the polynomial on the RHS takes two matrix arguments), since $s^jA^k = (sI)^jA^k$ for any non-negative integers $j,k$.

So for any complex $s$, substituting the commuting matrices $sI$ and $A$ into the polynomial identity above gives $$ (sI - A) \Delta p(s,A) = (sI - A) \Delta p(sI,A) = p(sI) - p(A) = p(sI) = p(s)I, $$ since $p(A) = 0$ by Cayley–Hamilton and $p(sI) = p(s)I = \det(sI-A)\,I$.

If $s$ is such that $sI-A$ is invertible, we have $\Delta p(s,A) = (sI-A)^{-1}p(s)I = \operatorname{adj}(sI-A).$

Comparing entrywise, this is an equality of $n^2$ polynomials in $s$. Two polynomials are identical if they agree at infinitely many points, and since $sI - A$ is invertible for all but finitely many $s$ (namely, away from the eigenvalues of $A$), we must have equality for all $s$.
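The full identity can also be checked symbolically for a concrete matrix. A SymPy sketch (the $3\times 3$ matrix $A$ below is an arbitrary example):

```python
import sympy as sp

# Symbolic check of adj(sI - A) = Δp(s, A) for a concrete 3x3 matrix.
s, t = sp.symbols('s t')
A = sp.Matrix([[1, 2, 0], [0, 3, 1], [4, 0, 5]])
n = A.shape[0]

# Coefficients p_0, ..., p_n of p(t) = det(tI - A).
p = sp.expand((t * sp.eye(n) - A).det())
pc = [p.coeff(t, i) for i in range(n + 1)]

# Δp(s, A) = Σ_{j=0}^{n-1} Σ_{i=0}^{n-j-1} p_{i+j+1} s^i A^j
Dp = sp.zeros(n, n)
for j in range(n):
    for i in range(n - j):
        Dp += pc[i + j + 1] * s**i * A**j

adj = (s * sp.eye(n) - A).adjugate()
print(sp.expand(adj - Dp))  # zero matrix
```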

Second answer:

The statement is almost trivial if you employ the method of universal identities and assume that $A$ is diagonal. A more concrete proof is still not hard if you know the definition of an adjugate operator (as opposed to an adjugate matrix).

Without going down the full-fledged multilinear algebra path, an adjugate operator can be defined in a coordinate-free manner as follows. Let $f$ be the characteristic polynomial of a linear endomorphism $T$ on an $n$-dimensional vector space. Then $\operatorname{adj}(T)=g(T)$, where $g$ is the polynomial defined by $g(t)=(-1)^{n+1}\frac{f(t)-f(0)}{t-0}$. When a basis is chosen, so that the matrix of $T$ is $B$, the matrix $g(B)$ equals the adjugate matrix $\operatorname{adj}(B)$ as conventionally defined by cofactor calculations.

In your case, the characteristic polynomial of $B=sI-A$ is $$ f(t)=\det(tI-(sI-A))=(-1)^np(s-t) =(-1)^n\sum_{k=0}^np_k(s-t)^k. $$ Therefore $\operatorname{adj}(B)=g(B)$, where $$ g(t)=(-1)^{n+1}\frac{f(t)-f(0)}{t-0}=-\sum_{k=1}^np_k\frac{(s-t)^k-s^k}{t}. $$ It follows that $$\begin{align*} g(s-t)&=-\sum_{k=1}^np_k\frac{t^k-s^k}{s-t} =\sum_{k=1}^np_k\sum_{j=0}^{k-1} s^{k-j-1}t^j\\ &=\sum_{j=0}^n\sum_{k=j+1}^n p_k s^{k-j-1}t^j =\sum_{j=0}^{n-1}\sum_{i=0}^{n-j-1} p_{i+j+1} s^it^j\\ &=\Delta p(s,t), \end{align*}$$ and hence $\operatorname{adj}(sI-A)=g(sI-A)=\Delta p(s,A)$.
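The coordinate-free formula $\operatorname{adj}(B)=g(B)$ can likewise be checked with SymPy (the matrix $B$ below is an arbitrary example):

```python
import sympy as sp

# Check of adj(B) = g(B), where g(t) = (-1)^{n+1} (f(t) - f(0)) / t
# and f is the characteristic polynomial of B.
t = sp.symbols('t')
B = sp.Matrix([[2, 1, 0], [1, 0, 3], [0, 1, 1]])
n = B.shape[0]

f = sp.expand((t * sp.eye(n) - B).det())
g = sp.expand((-1)**(n + 1) * (f - f.subs(t, 0)) / t)  # a polynomial in t

# Evaluate g at the matrix B.
gB = sp.zeros(n, n)
for i in range(sp.degree(g, t) + 1):
    gB += g.coeff(t, i) * B**i

print(gB == B.adjugate())  # True
```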