If the same variable $t$ is added to all entries of a matrix $A$, express the determinant of the resulting matrix as a first-degree polynomial in $t$.


Prove that if the same number $t$ is added to each element of the matrix, then the determinant of the new matrix is equal to the determinant of the original matrix plus $t$ times the sum of the cofactors of all elements of the original matrix.


Some notation: let $\mathbf{1}$ be the column vector of ones and $\text{com}(M)$ the matrix of cofactors of $M$ (also called the "comatrix"). Let $\Omega= \mathbf{1}\mathbf{1}^T$ be the matrix with all its entries equal to one, and let $s$ be the sum of all cofactors of the matrix $A$.

We have to prove that

$$\det(A+t\Omega)=\det(A)+ts \tag{*}$$
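Before proving (*), one can check it numerically. The sketch below (plain Python with exact `Fraction` arithmetic; the matrix `A` and value `t` are arbitrary illustrative choices) computes both sides for a $3\times 3$ example:

```python
from fractions import Fraction

def det(M):
    """Determinant by Laplace expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def cofactor_sum(M):
    """Sum of all cofactors (-1)^(i+j) * det(minor_ij) of M."""
    n = len(M)
    s = 0
    for i in range(n):
        for j in range(n):
            minor = [row[:j] + row[j+1:]
                     for k, row in enumerate(M) if k != i]
            s += (-1) ** (i + j) * det(minor)
    return s

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
t = Fraction(5, 2)
shifted = [[a + t for a in row] for row in A]  # A + t*Omega
assert det(shifted) == det(A) + t * cofactor_sum(A)
```

Exact rational arithmetic avoids floating-point round-off, so the assertion tests the identity exactly for this example.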

Here are two proofs:

First proof:

Let us recall Cramer's formula:

$$A^{-1}=\dfrac{1}{\det(A)}(\text{com}(A))^T, \ \ \text{rewritten in the form} \ \ \ \text{com}(A)=\det(A)\,(A^{-1})^T\tag{1}$$

Besides, it is easy to establish that the sum of all entries of a matrix $M$ can be written

$$\mathbf{1}^TM\mathbf{1} \tag{2}$$
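As a quick sanity check of (2), with a small hypothetical example, the quadratic form $\mathbf{1}^TM\mathbf{1}$ indeed sums all entries of $M$:

```python
M = [[1, 2], [3, 4]]
ones = [1, 1]  # the all-ones column vector
# 1^T M 1, computed entrywise
quad = sum(ones[i] * M[i][j] * ones[j]
           for i in range(2) for j in range(2))
assert quad == sum(sum(row) for row in M)  # both equal 10
```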

Consequently, combining (1) and (2),

$$s=\det(A)\,\mathbf{1}^T(A^{-1})^T\mathbf{1}\tag{3}$$

The identity we are going to establish is a consequence of the matrix determinant lemma in the form:

$$\det(I+uv^T)=1+u^Tv\tag{4}$$
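Lemma (4) can also be verified on a small example. The sketch below builds $I+uv^T$ for hypothetical vectors $u,v$ of length $3$ and checks both sides with an explicit $3\times 3$ determinant (Sarrus' rule):

```python
from fractions import Fraction

def det3(M):
    """Explicit 3x3 determinant via Sarrus' rule."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

u = [Fraction(1), Fraction(-2), Fraction(3)]
v = [Fraction(2), Fraction(1), Fraction(1)]
# I + u v^T, built entrywise
M = [[(1 if i == j else 0) + u[i] * v[j] for j in range(3)]
     for i in range(3)]
# u^T v = 2 - 2 + 3 = 3, so both sides equal 4
assert det3(M) == 1 + sum(ui * vi for ui, vi in zip(u, v))
```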

(where $u$ and $v$ are column vectors).

Let us take $v:=\mathbf{1}$ and $u:=tA^{-1}\mathbf{1}$. Then (4) becomes:

$$\det(I+tA^{-1}\mathbf{1}\mathbf{1}^T)=1+t \ \mathbf{1}^T(A^{-1})^T\mathbf{1}$$

Multiplying both sides by $\det(A)$, one gets:

$$\det(A(I+tA^{-1}\Omega))=\det(A)+t \ \det(A)\mathbf{1}^T(A^{-1})^T\mathbf{1}$$

The left-hand side is $\det(A+t\Omega)$ and, by (3), the right-hand side is $\det(A)+ts$; thus we have established (*).

Final remarks:

1) The matrix determinant lemma is sometimes called Sylvester's determinant identity; see for example https://mathoverflow.net/q/186672.

2) We have assumed $A$ invertible. This assumption isn't necessary, as will be seen in the second proof.


Second proof:

Let us take, for the sake of simplicity, the case of a $3 \times 3$ matrix.

Denoting by $A_k$ the $k$th column of $A$, we can write:

$\det (A + t\Omega) = \det[A_1+t\mathbf{1},A_2+t\mathbf{1},A_3+t\mathbf{1}] $

$=\underbrace{\det[A_1,A_2,A_3]}_{\det(A)} +t \underbrace{(\det[\mathbf{1},A_2,A_3]+\det[A_1,\mathbf{1},A_3]+ \det[A_1,A_2,\mathbf{1}])}_{s'} \tag{5} $

by multilinearity of the determinant; all other terms contain at least two $t\mathbf{1}$ columns, like $\det[t\mathbf{1},A_2,t\mathbf{1}]$, and are zero because a determinant with two identical columns vanishes.

Besides, it is not difficult to see that $s'=s$: expanding each of the three determinants (by Laplace) along its $\mathbf{1}$ column produces, three at a time, exactly the cofactors of $A$, so their sum is the sum of all nine cofactors.
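The equality $s'=s$ can likewise be checked numerically. In this sketch (the matrix `A` is a hypothetical example), each column of a $3\times 3$ matrix is replaced in turn by the all-ones column, and the resulting sum of determinants is compared with the sum of all cofactors:

```python
def det3(M):
    """Explicit 3x3 determinant via Sarrus' rule."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]

def ones_in_col(M, j):
    """Copy of M with column j replaced by the all-ones column."""
    return [[1 if c == j else row[c] for c in range(3)] for row in M]

# s' = det[1, A2, A3] + det[A1, 1, A3] + det[A1, A2, 1]
s_prime = sum(det3(ones_in_col(A, j)) for j in range(3))

# s = sum of all cofactors (-1)^(i+j) * det(minor_ij)
s = sum((-1) ** (i + j)
        * det2([[row[c] for c in range(3) if c != j]
                for k, row in enumerate(A) if k != i])
        for i in range(3) for j in range(3))

assert s_prime == s
```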