Polynomial form of $\det(A+xB)$


Let $A$ and $B$ be two $2 \times 2$ matrices with integer entries. Prove that $\det(A+xB)$ is an integer polynomial of the form $$P(x) = \det(A+xB) = \det(B)x^2+mx+\det(A).$$

I tried expanding the determinant of $\det(A+xB)$ for two arbitrary matrices, but it got computational. Is there another way?

There are 3 solutions below.

Solution 1.

The constant term $\det(A)$ comes from setting $x = 0$. The leading coefficient $\det(B)$ is the constant term of $x^2 P(1/x) = \det(xA + B)$, again obtained by setting $x = 0$.
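As a quick numerical sanity check (my own Python sketch, not part of the argument), one can recover the three coefficients of $P$ by evaluating it at $x = 0, \pm 1$ and confirm that the constant term is $\det(A)$ and the leading coefficient is $\det(B)$:

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def poly_coeffs(A, B):
    """Coefficients (c0, c1, c2) of P(x) = det(A + xB), recovered by
    interpolation from the values P(0), P(1), P(-1)."""
    P = lambda x: det2([[A[i][j] + x * B[i][j] for j in range(2)]
                        for i in range(2)])
    c0 = P(0)
    c2 = (P(1) + P(-1)) // 2 - c0   # even combination isolates x^2 term
    c1 = (P(1) - P(-1)) // 2        # odd combination isolates x term
    return c0, c1, c2

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
c0, c1, c2 = poly_coeffs(A, B)
assert c0 == det2(A)  # constant term is det(A)
assert c2 == det2(B)  # leading coefficient is det(B)
```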

Solution 2.

$$\begin{pmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{pmatrix}+x\begin{pmatrix}b_{11}&b_{12}\\b_{21}&b_{22}\end{pmatrix}=\begin{pmatrix}a_{11}+xb_{11} &a_{12}+xb_{12}\\a_{21}+xb_{21}&a_{22}+xb_{22}\end{pmatrix}=C$$ $$\det(C)= a_{11}a_{22}+a_{11}xb_{22}+a_{22}xb_{11}+x^2b_{11}b_{22}- a_{21}a_{12}-a_{21}xb_{12}-a_{12}xb_{21}-x^2b_{21}b_{12} $$ $$=\det(A)+x\left[\det\begin{pmatrix}a_{11}&a_{12}\\b_{21}&b_{22}\end{pmatrix}+\det\begin{pmatrix}b_{11}&b_{12}\\a_{21}&a_{22}\end{pmatrix}\right]+x^2\det(B)$$
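The mixed-determinant formula for the $x$-coefficient can be verified numerically. The following Python sketch (names are mine, for illustration) compares it against a direct evaluation of $\det(A+xB)$ at several integer points:

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def mixed_m(A, B):
    """m = det([row 1 of A; row 2 of B]) + det([row 1 of B; row 2 of A])."""
    return det2([A[0], B[1]]) + det2([B[0], A[1]])

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
m = mixed_m(A, B)
for x in range(-3, 4):
    C = [[A[i][j] + x * B[i][j] for j in range(2)] for i in range(2)]
    assert det2(C) == det2(A) + x * m + x * x * det2(B)
```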

Solution 3.

Another approach, in which all matrices are complex valued. This argument can be extended to $n\times n$ matrices: see the end of the post. (EDIT: I have found another error; the general case is incomplete.)

First step: Assume that $A=I$ and let $\lambda_1, \lambda_2$ denote the eigenvalues of $B$. (They might not be distinct, this does not affect the argument). Then $$\tag{1}\det(I+xB)=(1+x\lambda_1)(1+x\lambda_2)=1 + x\,\mathrm{trace} (B) + x^2 \det B.$$ Here we use the fact that the eigenvalues of $I+xB$ are $1+x\lambda_1, 1+x\lambda_2$ and that the determinant equals the product of the eigenvalues.

Second step: Now assume that $A$ is invertible. Write $$\tag{2}\det(A+xB)=\det(A)\det(I+xA^{-1}B) = \det(A) + x\,\mathrm{trace}(\det(A)A^{-1}B) + x^2\det(B).$$ Here we used the first step together with Binet's theorem (i.e., $\det(MN)=\det(M)\det(N)$).

Final step: Now we remove the invertibility assumption on $A$ with a continuity argument. Note that, if $A=\begin{bmatrix} a & b \\ c & d\end{bmatrix}$ is invertible, one has $$\det(A)A^{-1}=\begin{bmatrix} d & -b \\ -c & a\end{bmatrix}.$$ If one rewrites formula (2) as follows $$\tag{3} \det(A+xB)=\det(A) + x\, \mathrm{trace}\left( \begin{bmatrix} d & -b \\ -c & a\end{bmatrix}B\right) + x^2 \det(B), $$ one sees at once that the right-hand side makes sense for all matrices, not only for invertible ones. More precisely, formula (3) holds for all invertible $A$, and the invertible matrices are a dense subset of the space of all matrices, so both sides being continuous in $A$, the formula extends by continuity.
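To illustrate the continuity step, here is a small Python check (my own sketch) of formula (3) for a singular integer matrix $A$: the inverse $A^{-1}$ does not exist, but the entries $d, -b, -c, a$ of $\det(A)A^{-1}$ are still defined, and the identity still holds.

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def trace_adjA_B(A, B):
    """trace([[d, -b], [-c, a]] * B), written out entrywise."""
    a, b = A[0]
    c, d = A[1]
    return d * B[0][0] - b * B[1][0] - c * B[0][1] + a * B[1][1]

A = [[1, 2], [2, 4]]      # singular: det(A) == 0, so A**-1 does not exist
B = [[0, 1], [1, 0]]
m = trace_adjA_B(A, B)
for x in range(-3, 4):
    C = [[A[i][j] + x * B[i][j] for j in range(2)] for i in range(2)]
    assert det2(C) == det2(A) + x * m + x * x * det2(B)  # formula (3)
```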

When all matrices are integer valued, the term $$m=\mathrm{trace}\left( \begin{bmatrix} d & -b \\ -c & a\end{bmatrix}B\right)$$ is an integer, and this completes the proof.

The $n\times n$ case. In the general case, one obtains in the $A=I$ case the formula \begin{equation}\begin{split}\det(I+xB)&=1+\mathrm{trace}(B)x + \sum_{i<j} \lambda_i\lambda_j x^2+\ldots +\sum_{i_1<\ldots<i_{n-1}} \lambda_{i_1}\ldots\lambda_{i_{n-1}}x^{n-1} + \det(B)x^n \\ &= 1+\mathrm{trace}(B)x+p_2(B)x^2+\ldots + p_{n-1}(B)x^{n-1}+\det(B)x^n.\end{split}\end{equation}

The coefficients $p_2(B),\ldots,p_{n-1}(B)$ are the sums of all principal minors of $B$ of order $2,\ldots,n-1$ respectively.
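For a concrete $3\times 3$ illustration, the following Python sketch (my own naming; `p2` is the sum of the $2\times 2$ principal minors) checks that $\det(I+xB) = 1+\mathrm{trace}(B)x+p_2(B)x^2+\det(B)x^3$ at several integer points:

```python
from itertools import combinations

def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def p2(B):
    """Sum of the 2x2 principal minors of a 3x3 matrix B."""
    return sum(B[i][i] * B[j][j] - B[i][j] * B[j][i]
               for i, j in combinations(range(3), 2))

B = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]
tr = B[0][0] + B[1][1] + B[2][2]
for x in range(-3, 4):
    M = [[(1 if i == j else 0) + x * B[i][j] for j in range(3)]
         for i in range(3)]
    assert det3(M) == 1 + tr * x + p2(B) * x**2 + det3(B) * x**3
```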

Therefore, using the same argument as before, we obtain in the case of invertible $A$ the formula $$ \det(A+xB)=\det(A) + \mathrm{trace}(\det A A^{-1}B) x+ \det A p_2( A^{-1}B)x^2+\ldots+ \det A p_{n-1}( A^{-1}B)x^{n-1} + \det (B) x^n.$$

To remove the invertibility assumption on $A$, the formula one should use is the following: $$\det(A)A^{-1}=\mathrm{adj}(A), $$ where $\mathrm{adj}(A)$ denotes the adjugate matrix of $A$.

The final result is (warning: incomplete) $$\tag{4} \det(A+xB)=\det(A) + \mathrm{trace}(\mathrm{adj}(A)B) x+ \det A \, p_2(A^{-1}B)x^2+\ldots +\det A \, p_{n-1}(A^{-1}B)x^{n-1} + \det (B) x^n.$$

TO DO: Find formulas for $\det A\cdot p_{2}(A^{-1}B), \ldots, \det A\cdot p_{n-1}(A^{-1}B)$ which do not rely on the invertibility of $A$.