Row-expansion determinants are equal for square matrices


For a square matrix $A = [a_{ij}]$, where $M_{ij}$ denotes the minor matrix of $A$ (the submatrix obtained by deleting row $i$ and column $j$), define the first-row determinant recursively as $$|A|_1 = \sum_{j=1}^N a_{1j} \, |M_{1j}|_1 \, (-1)^{j+1}$$ where in the case $N = 1$, $|A|_1$ is simply the absolute value of the real number. Define the second-row determinant recursively as $$|A|_2 = \sum_{j=1}^N a_{2j} \, |M_{2j}|_2 \, (-1)^{j}$$ for $N \ge 2$.

Using mathematical induction, show that $$ |A|_1 = |A|_2 $$ and hence argue that $$ |A|_i = |A|_1 $$ for all $i \ne 1$.

I am completely lost on this problem because I have to use mathematical induction to prove that $|A|_i=|A|_1$. Can someone please tell me how to solve the problem and what properties I should use? Thanks in advance!


I'm highly skeptical that you've transcribed the problem correctly. As noted in the comments, the determinant of a $1\times 1$ matrix is not (always) an absolute value. Also, your "recursive" definition of the "second row determinant" makes no sense because if the matrix is $2\times 2$, you can't expand along the non-existent second row of a $1\times 1$ submatrix.

Typically one defines a determinant in terms of a row (or column) expansion when trying to prove the existence of an $n\times n$ determinant function by induction on $n$. Such a function is required to satisfy certain properties, like:

  1. The determinant is multilinear in the columns of the matrix.
  2. The determinant is alternating in the columns of the matrix (i.e. is zero when two columns are equal, or equivalently changes sign when two columns are interchanged if the characteristic of the base field is not 2).
  3. The determinant of the identity matrix is 1.

For $n=1$, we can use the function which returns the single entry of the matrix. For $n>1$, if we assume we have a function $|\phantom{A}|$ on $(n-1)\times(n-1)$ matrices satisfying (1)-(3) and $A=(a_{ij})$ is an $n\times n$ matrix, we can define for each $1\le i\le n$ $$|A|_i=\sum_{j=1}^n(-1)^{i+j}a_{ij}|A_{ij}|\tag{4}$$ where $A_{ij}$ is the $(n-1)\times(n-1)$ submatrix obtained from $A$ by deleting row $i$ and column $j$. It's possible to prove from (4) using the induction hypothesis that each function $|\phantom{A}|_i$ satisfies (1)-(3).
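As a sanity check (not a substitute for the inductive proof), here is a short Python sketch of definition (4): a recursive cofactor expansion along an arbitrary row $i$, evaluated on a sample matrix for every choice of row. The matrix and function names are illustrative, not from the problem statement.

```python
def minor(A, i, j):
    """Submatrix of A obtained by deleting row i and column j (0-indexed)."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det_row(A, i=0):
    """Determinant of A by cofactor expansion along row i, as in (4)."""
    n = len(A)
    if n == 1:
        return A[0][0]  # base case: the single entry itself, NOT its absolute value
    return sum((-1) ** (i + j) * A[i][j] * det_row(minor(A, i, j))
               for j in range(n))

A = [[2, -1, 3],
     [0, 4, 1],
     [5, 2, -2]]

# Expanding along row 0, 1, or 2 gives the same value: [-85, -85, -85]
print([det_row(A, i) for i in range(3)])
```

Note that the base case returns the entry itself, in line with the first paragraph's objection to taking an absolute value for $1\times 1$ matrices.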

This establishes the existence of $n$ functions $|\phantom{A}|_i$ on $n\times n$ matrices satisfying (1)-(3). But it's also possible to prove that there's at most one function satisfying (1)-(3): any function satisfying these properties is given by the Leibniz expansion. So all of the functions $|\phantom{A}|_i$ are actually equal, and we just write $|\phantom{A}|$.
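To make the uniqueness claim concrete (again a numerical sketch, not part of the proof), the Leibniz expansion sums sign-weighted products over all permutations, and any row expansion must reproduce its value. A minimal Python version, assuming Python 3.8+ for `math.prod`:

```python
import math
from itertools import permutations

def perm_sign(p):
    """Sign of the permutation p, computed from its inversion count."""
    inversions = sum(1 for a in range(len(p)) for b in range(a + 1, len(p))
                     if p[a] > p[b])
    return -1 if inversions % 2 else 1

def det_leibniz(A):
    """Leibniz formula: sum over all permutations of sign times product."""
    n = len(A)
    return sum(perm_sign(p) * math.prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

A = [[2, -1, 3],
     [0, 4, 1],
     [5, 2, -2]]
print(det_leibniz(A))                  # -85, matching every row expansion
print(det_leibniz([[1, 0], [0, 1]]))   # the identity has determinant 1
```

This is $O(n \cdot n!)$ and only practical for tiny matrices, but it makes a useful cross-check against any row-expansion implementation.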