Are alternate theories of determinants equivalent?


Let $d: \mathbb F^{m \times n} \to \mathbb F$ be a function from matrices over a field $\mathbb F$ to scalars which is row linear, meaning:

(i) For any matrix $M$, if $M'$ is $M$ with a single row multiplied by an arbitrary scalar $r$, then $d(M') = r\,d(M)$.

(ii) For any matrix $M$, if there exist matrices $P, Q$ such that one particular row of $M$ is the sum of the corresponding rows of $P$ and $Q$, and every other row of $M$, $P$, and $Q$ is identical, then $d(M) = d(P) + d(Q)$.

Prove that the following two properties are equivalent:

(a) If $M'$ is $M$ with two rows interchanged, then $d(M) = -d(M')$.

(b) If $M$ has any two rows identical, $d(M) = 0$.

Source: Based on https://ocw.mit.edu/ans7870/18/18.013a/textbook/HTML/chapter04/section01.html . The goal is to develop the theory of determinants from first principles.
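As a quick sanity check (not part of the proof), the ordinary $3 \times 3$ determinant is one function satisfying (i) and (ii). A minimal sketch in Python, with `det3` a helper name of my choosing, computing the determinant by cofactor expansion along the first row:

```python
# det3: 3x3 determinant via cofactor expansion along the first row,
# used as a concrete example of a row-linear function d.
def det3(M):
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

M = [[2, 1, 0],
     [4, 3, 5],
     [1, 1, 2]]

# (i) Multiplying a single row by a scalar r multiplies the determinant by r.
r = 7
M_scaled = [M[0], [r * x for x in M[1]], M[2]]
assert det3(M_scaled) == r * det3(M)

# (ii) Row 1 of M is the sum of the corresponding rows of P and Q,
# and all other rows of M, P, Q agree, so d(M) = d(P) + d(Q).
P = [M[0], [1, 2, 3], M[2]]
Q = [M[0], [3, 1, 2], M[2]]  # [3, 1, 2] = M[1] - [1, 2, 3]
assert det3(M) == det3(P) + det3(Q)
```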

My partial proof is below. I'd appreciate help completing it, and writing it more clearly. (I'll add that even the problem statement is not worded in a way I'm happy with.)

Clearly, property (a) implies property (b): consider a matrix $A$ with rows $i$ and $i'$ identical, and let $d(A) = k$. Interchanging rows $i$ and $i'$ yields the same matrix, which by property (a) must have determinant $-k$. Thus $k = -k$, so $k = 0$.

We now show that property (b) implies property (a). Consider matrices $A, B$ with all rows equal except that rows $i, i'$ are interchanged, and let $C = A + B$. By property (b), $d(C) = 0$, since rows $i, i'$ of $C$ are identical. If all rows other than $i$ and $i'$ were zero, it would be easy to complete the proof using the row-linearity properties, but I'm struggling to complete it when the rest of the matrix is not zero. I believe it could be done by induction over the non-zero rows, but I'm having trouble finishing it.

How can this proof be completed? And, what is a clearer way to write it?


Update

Thank you for the reference to Antisymmetric vs. alternating $k$-linear forms and wedge-product. From there, I learned some, but not all, of what I need.

What I learned:

  • Property (a) is called antisymmetric
  • Property (b) is called alternating
  • In multilinear forms, antisymmetric is equivalent to alternating (alternating always implies antisymmetric; the converse holds over fields of characteristic other than $2$)

Where I still need help:

From the suggested post, I gather that my properties (i) and (ii) are enough to make $d$ a multilinear form. I've encountered the terms bilinear form and alternating form, but don't know what they really are (how do they differ from linear functions? do we treat each row as a linear input, or each entry?). They're certainly beyond the scope of the source I drew the problem from.

The linked suggestion states

We can "FOIL" this out with the distributive property ($f$ is multilinear)

but I don't see how I've proven the distributive property. I infer that the recommended plan is something like:

  1. Use properties (i) and (ii) to prove $d$ is distributive
  2. Use that to "FOIL out" more complicated matrices into my simple case where all rows but $i$ are $0$
  3. Complete the proof from there.

But I'm still struggling to fill in the details.
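My best guess at that expansion (my own attempt, not taken from the source) is to apply property (ii) once in row $i$ and once in row $i'$ of the matrix whose rows $i$ and $i'$ both equal $r_i + r_{i'}$, where $r_i, r_{i'}$ denote rows $i, i'$ of $A$ and all other rows (suppressed below) are those of $A$:

$$\begin{aligned} 0 &= d(\dots, r_i + r_{i'}, \dots, r_i + r_{i'}, \dots) \\ &= d(\dots, r_i, \dots, r_i, \dots) + d(\dots, r_i, \dots, r_{i'}, \dots) + d(\dots, r_{i'}, \dots, r_i, \dots) + d(\dots, r_{i'}, \dots, r_{i'}, \dots) \\ &= 0 + d(A) + d(B) + 0, \end{aligned}$$

where the first and last equalities use property (b); this would give $d(A) = -d(B)$.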

Can you confirm that I have the plan correct?

I'm also surprised that such an introductory set of notes would assign something requiring machinery which seems so complex (at least to me). Is there a simpler approach? Or at least a simpler way of viewing this one?


Update 2

This text goes over the proof (and the definition of multilinear in this context) more clearly:

Multilinearity Property Let $i$ be a whole number between $1$ and $n$, and fix $n-1$ vectors $v_1, v_2, \dots, v_{i-1}, v_{i+1}, \dots, v_n$ in $\mathbb R^n$. If $f(v_1, v_2, \dots, v_{i-1}, x, v_{i+1}, \dots, v_n)$ is linear in $x$ for all $i$, then $f$ is multilinear [abridged from source]

In more theoretical treatments of the topic, where row reduction plays a secondary role, the defining properties of the determinant are often taken to be:

  1. The determinant is multilinear in the rows of $A$
  2. If $A$ has two identical rows, then its determinant is zero
  3. The determinant of the identity matrix is equal to one. [abridged]

The text then gives a simple proof of the equivalence of the definitions. (The proof seems equivalent to what's discussed here, but is presented simply and directly.)
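To tie this back to the original question, here is the same kind of sanity check for the three defining properties above (and the derived swap property (a)), against the ordinary $3 \times 3$ determinant; `det3` is again a helper name of my choosing:

```python
# det3: 3x3 determinant via cofactor expansion along the first row.
def det3(M):
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert det3(identity) == 1                 # property 3: det(I) = 1

A = [[2, 1, 0], [4, 3, 5], [1, 1, 2]]

A_dup = [A[0], A[0], A[2]]                 # rows 0 and 1 identical
assert det3(A_dup) == 0                    # property 2: identical rows give 0

A_swapped = [A[1], A[0], A[2]]             # rows 0 and 1 interchanged
assert det3(A_swapped) == -det3(A)         # derived property (a): sign flips
```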