I was thinking about a $100\times 100$ matrix. It seemed to me that computing its determinant should require $100!$ operations, because I start with $100$ determinants of $99\times 99$ matrices, then $99$ determinants of $98\times 98$ matrices, and so on. So $100!$ seemed right to me.
I searched on Google and saw others on CS sites saying the same: that it would take a computer $100!$ determinant computations, giving $100!$ running time.
The point is, later I thought this was all wrong: you can reduce the number significantly if you start from the other end. Compute just the $2\times 2$ determinants first and then work your way up. The $2\times 2$ minors are reusable, since many of them appear in several of the larger expansions.
This is where I'm stuck. I'm new to matrices, so I need someone to clarify: are there really $n!$ (here $100!$) determinants to compute, or is it really possible to go from the bottom of the matrix, computing the $2\times 2$ minors first and working up? Then we'd have far fewer than $100!/2$ determinants of $2\times 2$ matrices to compute, then even fewer $3\times 3$, and so on. Would it come out far less than $100!$?
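The reuse idea in the question can be sketched concretely. If you always expand along the top remaining row, every minor is determined by the set of columns it uses, so memoizing on that set means each distinct minor is computed only once; there are $2^n - 1$ nonempty column subsets, far fewer than $n!$. A minimal sketch (the function names are mine, not from any library):

```python
from functools import lru_cache

def det_memo(A):
    """Determinant by cofactor expansion, memoizing each minor by the
    tuple of columns it uses (its rows are always the last k rows of A).
    Distinct minors number 2^n - 1, instead of the ~n! recursive calls
    the naive expansion makes."""
    n = len(A)

    @lru_cache(maxsize=None)
    def minor(cols):
        # cols: sorted tuple of column indices; this minor lives in the
        # last len(cols) rows of A, restricted to those columns.
        k = len(cols)
        if k == 1:
            return A[n - 1][cols[0]]
        row = n - k                       # top row of this submatrix
        total = 0
        for i, c in enumerate(cols):
            rest = cols[:i] + cols[i + 1:]
            total += (-1) ** i * A[row][c] * minor(rest)
        return total

    return minor(tuple(range(n)))
```

For example, `det_memo([[1, 2], [3, 4]])` returns `-2`. This is still exponential, so it only partially rescues the expansion approach; the answer below gives the practical method.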
Large determinants are never computed by minor expansion, which indeed costs $C\cdot N!$ operations, where $C$ is a small constant (in the naive method there is no reuse). $100!$ elementary operations would take much longer than the lifetime of the universe (I tried it).
Fortunately, Gaussian elimination comes to the rescue and requires only $C\cdot N^3$ operations; $100^3$ is nothing.
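A minimal sketch of the $O(N^3)$ method: reduce the matrix to upper triangular form, multiply the diagonal pivots, and flip the sign once per row swap. Partial pivoting is included for numerical stability; the function name is mine.

```python
def det_gauss(A):
    """Determinant via Gaussian elimination with partial pivoting:
    O(n^3) operations instead of the n!-scale cost of naive expansion."""
    M = [row[:] for row in A]      # work on a copy, keep A intact
    n = len(M)
    det = 1.0
    for j in range(n):
        # choose the largest pivot in column j (partial pivoting)
        p = max(range(j, n), key=lambda i: abs(M[i][j]))
        if M[p][j] == 0.0:
            return 0.0             # singular matrix
        if p != j:
            M[j], M[p] = M[p], M[j]
            det = -det             # a row swap flips the sign
        det *= M[j][j]             # determinant = product of pivots
        for i in range(j + 1, n):
            f = M[i][j] / M[j][j]
            for k in range(j, n):
                M[i][k] -= f * M[j][k]
    return det
```

For a $100\times 100$ matrix this is roughly a million arithmetic operations, which a modern machine does in well under a millisecond.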