Determinant and matrix multiplication complexity?


I keep hearing that computing the determinant reduces to matrix multiplication, and that the two problems have the same complexity. How is this so, and can someone provide an algorithm that computes the determinant via matrix multiplication?



BEST ANSWER

Essentially, fast matrix multiplication can be used as a subroutine for fast LU (or, more generally, LUP) decomposition, and the LU decomposition gives us the determinant. (If $A = LU$, where $L$ is lower triangular and $U$ is upper triangular with a diagonal of all $1$s, then $\det(A) = \det(L)\det(U) = \det(L)$, which is the product of the diagonal entries of $L$. For an LUP decomposition $PA = LU$, multiply by the sign of the permutation $P$.)
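As a quick sanity check of the determinant-from-LU relationship (not the fast algorithm itself), here is a short sketch using SciPy's LU routine. Note that `scipy.linalg.lu` puts the unit diagonal on $L$ rather than $U$, so with that convention the determinant is the product of $U$'s diagonal, times the sign of the permutation:

```python
import numpy as np
from scipy.linalg import lu

def det_from_lu(A):
    # scipy.linalg.lu returns P, L, U with A = P @ L @ U,
    # where L is unit lower triangular (all 1s on its diagonal).
    P, L, U = lu(A)
    sign = round(np.linalg.det(P))  # a permutation matrix has determinant +1 or -1
    return sign * float(np.prod(np.diag(U)))

A = np.array([[4.0, 3.0], [6.0, 3.0]])
print(det_from_lu(A))  # agrees with np.linalg.det(A), i.e. -6
```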

For a fast LU decomposition, we have a divide-and-conquer algorithm as follows. Divide the given $2n \times 2n$ matrix into four $n \times n$ matrices and try to factor $$ \begin{bmatrix}A & B \\ C & D\end{bmatrix} = \begin{bmatrix}L_{11} & 0 \\ L_{21} & L_{22}\end{bmatrix} \begin{bmatrix}U_{11} & U_{12} \\ 0 & U_{22}\end{bmatrix}. $$ This is done in four steps:

  • We have $A = L_{11} U_{11}$. So we can find $L_{11}$ and $U_{11}$ by solving an LU factorization problem of half the size.
  • We have $B = L_{11} U_{12}$. Once $L_{11}$ is found, we can invert it in $O(M(n))$ time (a triangular matrix can be inverted by a similar divide-and-conquer that costs only a constant number of half-size multiplications) and set $U_{12} = L_{11}^{-1} B$.
  • We have $C = L_{21} U_{11}$. Once $U_{11}$ is found, we can likewise invert it in $O(M(n))$ time and set $L_{21} = C U_{11}^{-1}$.
  • We have $D = L_{21} U_{12} + L_{22} U_{22}$, so we can find $L_{22}$ and $U_{22}$ by computing $D - L_{21} U_{12}$, then solving another LU factorization problem of half the size.
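The four steps above translate almost directly into code. Here is a minimal sketch in the answer's convention ($U$ has a unit diagonal, so $\det$ is the product of $L$'s diagonal), assuming the dimension is a power of two and no pivoting is ever needed; for clarity it uses generic triangular solves where a truly fast implementation would invert the triangular factors via fast matrix multiplication:

```python
import numpy as np

def block_lu(M):
    # Divide-and-conquer LU without pivoting (Crout form: U unit upper triangular).
    # Assumes M is square, its size is a power of 2, and all leading
    # principal minors are nonzero, so no row permutation is required.
    n = M.shape[0]
    if n == 1:
        return M.copy(), np.array([[1.0]])
    h = n // 2
    A, B = M[:h, :h], M[:h, h:]
    C, D = M[h:, :h], M[h:, h:]
    L11, U11 = block_lu(A)               # step 1: recurse on A
    U12 = np.linalg.solve(L11, B)        # step 2: U12 = L11^{-1} B
    L21 = np.linalg.solve(U11.T, C.T).T  # step 3: L21 = C U11^{-1}
    L22, U22 = block_lu(D - L21 @ U12)   # step 4: recurse on the Schur complement
    Z = np.zeros((h, h))
    L = np.block([[L11, Z], [L21, L22]])
    U = np.block([[U11, U12], [Z, U22]])
    return L, U

M = np.array([[2.0, 1, 1, 0], [4, 3, 3, 1], [8, 7, 9, 5], [6, 7, 9, 8]])
L, U = block_lu(M)
print(np.prod(np.diag(L)))  # the determinant of M
```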

In general, an LU factorization might not exist; you may need to permute the rows to get an LUP decomposition $PA = LU$. (The reason the above procedure might fail is that one of the triangular factors being inverted might be singular at some stage even when the input matrix is non-singular; for example, $\begin{bmatrix}0 & 1 \\ 1 & 0\end{bmatrix}$ is non-singular but has no LU factorization.) This makes the algorithm more complicated, but the theory is the same.

What is the running time of the algorithm above? The processing done on the $2n \times 2n$ matrix requires three $n \times n$ matrix multiplications, two triangular inversions costing $O(M(n))$ each, a subtraction costing $O(n^2)$, and two $n \times n$ LU factorizations.

So if $L(n)$ is the complexity of this algorithm and $M(n)$ the complexity of your favorite method of matrix multiplication, we have $L(2n) = 2 L(n) + \Theta(M(n))$; assuming $M(n) = \Theta(n^\omega)$ for some $\omega \ge 2$, the third case of the master theorem gives $L(n) = \Theta(M(n))$.
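Concretely (assuming $M(n) = \Theta(n^\omega)$ with $\omega > 1$), unrolling the recurrence shows why the multiplications at the top level dominate: $$ L(n) = \sum_{k \ge 0} 2^k \, \Theta\!\left(M\!\left(\tfrac{n}{2^{k+1}}\right)\right) = \Theta\!\left(n^\omega \sum_{k \ge 0} 2^{k(1-\omega)}\right) = \Theta(n^\omega) = \Theta(M(n)), $$ since the geometric series converges whenever $\omega > 1$.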