The time complexity of calculating the determinant of an $n \times n$ matrix (with some additional properties on its entries) is known to be in $O(n!)$. What is a good reference on the topic? The derivation might not be so obvious.
Let's quickly see what could be done:
- for a $2 \times 2$ matrix, I'm counting three basic operations: two products and one subtraction
- for a $3\times 3$ matrix, using Laplace expansion, I'm counting three $2 \times 2$ determinants, plus 3 multiplications (each $2\times 2$ determinant times its cofactor entry), plus 2 additions/subtractions, that is, $3 \cdot 3 + 3 + 2 = 14$ operations in total
- for an $n\times n$ matrix, I'm counting $n$ determinants of size $(n-1)\times (n-1)$, plus $n$ multiplications, plus $n-1$ additions/subtractions. All in all, if I call $u_{n-1}$ the cost of calculating an $(n-1)\times (n-1)$ determinant, I get the recurrence $u_{n}=nu_{n-1}+2n-1$
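The recurrence above is easy to check numerically. A minimal sketch (the function name `op_count` is mine, not from the post) that also prints the ratio $u_n/n!$, which appears to settle near a constant:

```python
import math

def op_count(n):
    """Operation count u_n for an n x n determinant by cofactor expansion,
    following the recurrence u_n = n*u_{n-1} + 2n - 1 derived above."""
    if n <= 1:
        return 0   # a 1x1 determinant needs no arithmetic
    if n == 2:
        return 3   # two products, one subtraction
    # n sub-determinants, n multiplications by cofactors, n-1 additions
    return n * op_count(n - 1) + 2 * n - 1

for n in range(2, 8):
    print(n, op_count(n), op_count(n) / math.factorial(n))
```

The first values are $u_2 = 3$ and $u_3 = 14$, matching the hand counts above, and the printed ratio $u_n/n!$ stays bounded, consistent with $u_n \in O(n!)$.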
Now, I'm supposed to show that $u_n \in O(n!)$ and this is where I am stuck.
As you mentioned, a $2\times 2$ determinant is just two multiplications and one subtraction. Using expansion by cofactors, a $3\times 3$ determinant is a signed sum of three multiples of $2\times 2$ determinants, a $4\times 4$ determinant is a signed sum of four multiples of $3\times 3$ determinants, and so on. So the cost satisfies $u_n = n\,u_{n-1} + O(n)$; dividing by $n!$ gives $u_n/n! = u_{n-1}/(n-1)! + O(n/n!)$, and since the correction terms form a convergent series, $u_n/n!$ stays bounded and $u_n \in O(n!)$ therefore follows. This can be done faster, though.
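As an illustration (not part of the original answer), a direct recursive cofactor expansion along the first row does exactly the work counted by the recurrence in the question:

```python
def det_cofactor(m):
    """Determinant of a square matrix (list of lists) by Laplace
    expansion along the first row -- O(n!) arithmetic operations."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # minor: delete row 0 and column j
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        # alternating sign, entry times the minor's determinant
        total += (-1) ** j * m[0][j] * det_cofactor(minor)
    return total

print(det_cofactor([[1, 2], [3, 4]]))                   # -2
print(det_cofactor([[2, 0, 1], [1, 3, 2], [0, 1, 4]]))  # 21
```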
Bringing a matrix to reduced row echelon form by Gaussian elimination is $O(n^3)$. Among the elementary row operations used, only the rescalings and row swaps change the determinant (a swap flips its sign; adding a multiple of one row to another leaves it unchanged), so the determinant of the original matrix can be read off as the product of the pivots, with a sign flip per row swap. Consequently, finding the determinant is also $O(n^3)$. So is finding the inverse.
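A sketch of this $O(n^3)$ route, assuming Gaussian elimination with partial pivoting (the function name `det_gauss` is mine, not from the answer):

```python
def det_gauss(m):
    """Determinant via Gaussian elimination in O(n^3): eliminate below
    each pivot, multiply the pivots, flip the sign on each row swap."""
    n = len(m)
    a = [[float(x) for x in row] for row in m]  # work on a float copy
    det = 1.0
    for col in range(n):
        # partial pivoting: pick the row with the largest entry in this column
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        if a[piv][col] == 0.0:
            return 0.0          # singular matrix
        if piv != col:
            a[col], a[piv] = a[piv], a[col]
            det = -det          # each row swap flips the sign
        det *= a[col][col]      # accumulate the pivot
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
    return det

print(det_gauss([[2, 0, 1], [1, 3, 2], [0, 1, 4]]))  # ~21.0
```

The three nested loops make the cubic cost visible; this is essentially an LU factorization, which is also how standard libraries compute determinants in practice.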