Pseudo determinant of product of two square matrices


Let $A$ and $B$ be square symmetric matrices, where $A$ is singular and $B$ is non-singular. Is there a way to decompose

$\operatorname{Det}(AB)$ in terms of $\operatorname{Det}(A)$ and $\det(B)$?

Here $\operatorname{Det}(\cdot)$ denotes the pseudo-determinant and $\det(\cdot)$ the usual determinant of a square non-singular matrix.



No. For example, consider \begin{align*} A_1 &= \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \\ A_2 &= \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \\ B &= \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}. \end{align*} Then, $\operatorname{Det} A_1 = \operatorname{Det} A_2 = 1$, but $\operatorname{Det}(A_1 B) = 1$ and $\operatorname{Det} (A_2 B) = 2$. That is, $\operatorname{Det}(AB)$ is not purely a function of $\operatorname{Det} A$ and $\det B$.
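As a quick numerical sanity check of this counterexample (the `pdet` helper below is not part of the answer; it computes the pseudo-determinant as the product of the eigenvalues above a small tolerance):

```python
import numpy as np

def pdet(M, tol=1e-9):
    """Pseudo-determinant: product of the nonzero eigenvalues of M."""
    eig = np.linalg.eigvals(M)
    return float(np.real(np.prod(eig[np.abs(eig) > tol])))

A1 = np.array([[1.0, 0.0], [0.0, 0.0]])
A2 = np.array([[0.0, 0.0], [0.0, 1.0]])
B  = np.array([[1.0, 0.0], [0.0, 2.0]])

# A1 and A2 have the same pseudo-determinant ...
print(pdet(A1), pdet(A2))          # 1.0 1.0
# ... but the products with B do not:
print(pdet(A1 @ B), pdet(A2 @ B))  # 1.0 2.0
```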


We can decompose it, but not in terms of $\operatorname{Det}(A)$ and $\det(B)$.

Suppose that $A,B$ are $n \times n$ and let $k = \operatorname{rank} A$. We note that $\operatorname{Det}(A) = \operatorname{Tr}(\wedge^k(A))$, where $\wedge^k$ denotes the $k$-th exterior power; because $B$ is invertible, $AB$ also has rank $k$, so the same holds for $AB$. It follows that $$ \operatorname{Det}(AB) = \operatorname{Tr}(\wedge^k(AB)) = \operatorname{Tr}(\wedge^k(A)\wedge^k(B)). $$ This can be expressed as the following sum. Let $S_k$ denote the set of subsets of $\{1,\dots,n\}$ of size $k$. For any $R,C \in S_k$, let $A[R,C]$ denote the submatrix of $A$ consisting of the rows from $R$ and the columns from $C$. With that, we have $$ \operatorname{Det}(AB) = \sum_{R \in S_k} \det((AB)[R,R]) = \sum_{R,C \in S_k} \det(A[R,C])\det(B[C,R]). $$ This is an inconvenient sum in general, but it is manageable if $k$ is either small or close to $n$. For small $k$, however, there are more efficient methods.
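The minor sum can be checked directly in a few lines. The sketch below (with hypothetical helper names `pdet` and `minor_sum`, and example matrices chosen here, not taken from the answer) compares $\operatorname{Det}(AB)$ against $\sum_{R,C} \det(A[R,C])\det(B[C,R])$ with $k = \operatorname{rank} A$:

```python
import numpy as np
from itertools import combinations

def pdet(M, tol=1e-9):
    """Pseudo-determinant: product of the nonzero eigenvalues of M."""
    eig = np.linalg.eigvals(M)
    return float(np.real(np.prod(eig[np.abs(eig) > tol])))

def minor_sum(A, B, k):
    """Sum over R, C in S_k of det(A[R, C]) * det(B[C, R])."""
    n = A.shape[0]
    total = 0.0
    for R in combinations(range(n), k):
        for C in combinations(range(n), k):
            total += np.linalg.det(A[np.ix_(R, C)]) * np.linalg.det(B[np.ix_(C, R)])
    return total

# Example: symmetric A of rank 2, symmetric non-singular B (both 3x3).
A = np.diag([1.0, 2.0, 0.0])
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])
k = np.linalg.matrix_rank(A)            # k = rank(A) = 2
print(pdet(A @ B), minor_sum(A, B, k))  # both approximately 10
```

The double loop runs over $\binom{n}{k}^2$ pairs of index sets, which is exactly why the answer calls this sum inconvenient unless $k$ is small or close to $n$.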


When $A$ has rank $n-1$ (so that $AB$ does too, since $B$ is invertible), we can sometimes write this nicely in terms of the adjugate matrix. In particular, we have $$ \operatorname{Det}(AB) = \operatorname{tr}(\operatorname{adj}(AB)) = \operatorname{tr}(\operatorname{adj}(B)\operatorname{adj}(A)) = \det(B)\cdot \operatorname{tr} (B^{-1}\operatorname{adj}(A)). $$ Moreover, in the case that $A$ has eigenvalue $0$ with algebraic multiplicity $1$, if we have a unit vector $x$ that spans the kernel of $A$, a unit vector $y$ that spans the kernel of $A^T$, and $x^Ty \neq 0$ (which is guaranteed if $\operatorname{Det}(A) \neq 0$), then we can write $\operatorname{adj}(A) = \frac{\operatorname{Det}(A)}{x^Ty}\cdot xy^T$, so that $$ \operatorname{Det}(AB) = \frac{1}{x^Ty}\det(B)\cdot \operatorname{tr} (B^{-1}\operatorname{Det}(A)\cdot xy^T ) = \frac 1{x^Ty}\det(B) \operatorname{Det}(A) \cdot y^TB^{-1}x \\ = \det(B) \operatorname{Det}(A) \cdot \frac{y^TB^{-1}x}{y^Tx}. $$
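The final identity is also easy to verify numerically. The sketch below assumes a symmetric $A$ of rank $n-1$ (so $y = x$, since $\ker A^T = \ker A$), with the kernel vector supplied by hand for the example matrices chosen here; `pdet` is the same hypothetical pseudo-determinant helper as above:

```python
import numpy as np

def pdet(M, tol=1e-9):
    """Pseudo-determinant: product of the nonzero eigenvalues of M."""
    eig = np.linalg.eigvals(M)
    return float(np.real(np.prod(eig[np.abs(eig) > tol])))

# Symmetric A of rank n-1 = 2; symmetric non-singular B.
A = np.diag([1.0, 2.0, 0.0])
B = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 1.0]])

x = np.array([0.0, 0.0, 1.0])  # unit vector spanning ker(A)
y = x                          # A symmetric, so ker(A^T) = ker(A)

lhs = pdet(A @ B)
# det(B) * Det(A) * (y^T B^{-1} x) / (y^T x), with B^{-1} x via a linear solve
rhs = np.linalg.det(B) * pdet(A) * (y @ np.linalg.solve(B, x)) / (y @ x)
print(lhs, rhs)  # both approximately 10
```

Using `np.linalg.solve(B, x)` instead of forming $B^{-1}$ explicitly is the standard numerically stable way to evaluate $B^{-1}x$.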