This year I'm teaching an elementary course on linear algebra for physics students. Because of that I have been researching the different ways of presenting the definition of the determinant (one of the hardest topics for such a course, in my opinion).
In the lecture notes by Terry Loring, a definition using elementary row operations is given:
http://www.math.unm.edu/~loring/links/linear_s06/det_def.pdf
I like this definition since it seems very algorithmic to me, and it uses an idea that is already familiar to the students. Compare this with the other, more popular definitions:
- The explicit formula using the sign of permutations
- The recursive formula using the Laplace development
- The axiomatic one: the unique alternating multilinear form in the rows (or in the columns) that takes the value 1 at the identity matrix.
Of course, the definition by row elimination seems very close to this last, axiomatic definition.
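To make the "algorithmic" point concrete, here is a minimal Python sketch of the row-elimination computation (my own illustration, not taken from Loring's notes): reduce the matrix to upper-triangular form, track how each elementary operation scales the determinant, and multiply the pivots.

```python
from fractions import Fraction

def det(rows):
    """Determinant via row elimination: reduce to upper-triangular form,
    tracking the effect of each elementary row operation."""
    a = [[Fraction(x) for x in row] for row in rows]
    n = len(a)
    sign = 1
    for col in range(n):
        # Find a row with a nonzero entry in this column to use as pivot.
        pivot = next((r for r in range(col, n) if a[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)  # no pivot: the matrix is singular
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            sign = -sign  # swapping two rows flips the sign
        for r in range(col + 1, n):
            # Adding a multiple of one row to another leaves det unchanged.
            factor = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= factor * a[col][c]
    result = Fraction(sign)
    for i in range(n):
        result *= a[i][i]  # determinant of a triangular matrix = product of pivots
    return result
```

Exact rational arithmetic is used so that the examples above behave like hand computation; of course, nothing here yet *proves* that the answer is independent of the choices of pivots, which is exactly the well-definedness issue raised below.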
However, my question is: do you know whether there is some way of deriving the whole theory of determinants from the definition based on row elimination? Indeed, it even seems hard to prove directly that the definition is well defined (unambiguous).
Do you know some book using this approach?
If I remember correctly, Apostol's Calculus, Vol. 2, approaches it in the way that egreg outlined in his comment. I was able to check the table of contents on Amazon and verify that the title of Chapter 3 is "Determinants".
I don't have my copy to hand, and it's also possible that he instead states the axioms requiring it to be a multilinear form with certain other properties and then shows that row elimination is an effective way to compute it -- it's been a while since I read it. Either way, I suggest you check it out.