Can someone prove why a row replacement operation does not change the determinant of a matrix?
**row replacement operation being adding one row to another or something of that sort
One way to think about it, using the property: $\det(AB)=\det(A)\det(B)$:
Adding a multiple of one row to another is equivalent to left multiplication by an elementary matrix.
Let $B$ be an $n\times n$ matrix, and let $A$ be the $n\times n$ elementary matrix which acts as an operator adding $k$ copies of row $i$ to row $j$. Applying that row operation to $B$ then results in the matrix $AB$. Without loss of generality, $A$ has the form:
$$\begin{bmatrix}1 & & & & & & \\ & \ddots & & & & & \\ & & 1 & & & & \\ & & & \ddots & & & \\ & & k & & 1 & & \\ & & & & & \ddots & \\ & & & & & & 1\end{bmatrix}$$ where the entry $a_{ji}=k$.
The determinant of a triangular matrix is the product of the diagonal. $A$ has a unit diagonal, so $\det(A)=1$.
Therefore, $$\det(AB)=\det(A)\det(B)=1\det(B)=\det(B).$$
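This is not part of the original answer, but the argument above can be sanity-checked numerically with NumPy: build the elementary matrix, confirm its determinant is $1$, and confirm the row operation preserves $\det(B)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))

# Elementary matrix A adding k copies of row i to row j
# (i, j, k follow the notation in the answer above).
i, j, k = 0, 2, 5.0
A = np.eye(n)
A[j, i] = k

# A is triangular with a unit diagonal, so det(A) = 1
assert np.isclose(np.linalg.det(A), 1.0)
# The row operation AB has the same determinant as B
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(B))
```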
Short explanation: If every element of a column of a determinant is a linear combination of (two) columns of numbers, then the determinant is equal to the same linear combination of (two) determinants. Even better, this works for a linear combination of any number of columns!
Because of this, a common factor of a column can be factored out of the determinant, which implies that if a column consists entirely of zeros, we can factor the zero out, so the determinant vanishes.
Long explanation:
THEOREM (Linear Property of Determinants). If all the elements of the jth column of a determinant D are "linear combinations" of two columns of numbers, that is, if $$a_{ij} = \lambda b_i + \mu c_i$$ $(i=1,2,...,n)$, where $\lambda$ and $\mu$ are fixed numbers, then D is equal to a linear combination of two determinants: $$D = \lambda D_1+\mu D_2.$$ Here both determinants $D_1$ and $D_2$ have the same columns as the determinant D, except for the jth column; the jth column of $D_1$ consists of the numbers $b_i$, while the jth column of $D_2$ consists of the numbers $c_i$.
Proof: Use the definition of the determinant as the "algebraic sum of the $n!$ products of the form $a_{\alpha_1 1}a_{\alpha_2 2}\cdots a_{\alpha_n n}$(*), each preceded by the sign determined by its inversions". Substitute $(\lambda b_{\alpha_j} + \mu c_{\alpha_j})$ for the factor $a_{\alpha_j j}$ and use the distributive property.
Note: This linear property can easily be extended to the case where EVERY element of the jth column is a linear combination not of two terms but of any number of terms. Then, if $a_{ij} = \lambda b_i + \mu c_i + \dots + \tau f_i$, we have $D_j(a_{ij}) = D_j(\lambda b_i + \mu c_i + \dots + \tau f_i) = \lambda D_j (b_i) + \mu D_j (c_i) + \dots + \tau D_j (f_i)$.
(*): any product of n elements which appear in different rows and different columns of the square matrix (a product containing just one element from each row and just one element from each column).
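The linear property above can be checked numerically (this check is an addition, not part of the quoted book; the helper `det_with_col` is a hypothetical name introduced here for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n, j = 4, 2
D = rng.standard_normal((n, n))
b, c = rng.standard_normal(n), rng.standard_normal(n)
lam, mu = 2.0, -3.0

def det_with_col(M, col, v):
    """Determinant of M with column `col` replaced by the vector v."""
    M2 = M.copy()
    M2[:, col] = v
    return np.linalg.det(M2)

# D_j(lam*b + mu*c) == lam*D_j(b) + mu*D_j(c)
lhs = det_with_col(D, j, lam * b + mu * c)
rhs = lam * det_with_col(D, j, b) + mu * det_with_col(D, j, c)
assert np.isclose(lhs, rhs)
```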
COROLLARY #1. Any common factor of a column of a determinant can be factored out of the determinant.
Proof: If $a_{ij} = \lambda b_i$, then $D_j(a_{ij}) = D_j(\lambda b_i) = \lambda D_j(b_i)$.
COROLLARY #2. If a column of a determinant consists entirely of zeros, then the determinant vanishes.
Proof: Since 0 is a common factor of the elements of one of the columns, we can factor it out of the determinant, obtaining $$D_j(0) = D_j(0\cdot 1) = 0\cdot D_j(1) = 0.$$
THEOREM. The value of a determinant is not changed by adding the elements of one column multiplied by an arbitrary number to the corresponding elements of another column.
Proof: Suppose we add the kth column multiplied by the number $\lambda$ to the jth column $(k \neq j)$. The jth column of the resulting determinant consists of elements of the form $a_{ij} + \lambda a_{ik}$. By the linear property $D = \lambda D_1 + \mu D_2$, we have $$D_j (a_{ij} + \lambda a_{ik}) = D_j (a_{ij}) + \lambda D_j(a_{ik}).$$ The jth column of the second determinant consists of the elements $a_{ik}$, and hence is identical with its kth column. Since a determinant with two identical columns vanishes, $D_j(a_{ik}) = 0$, so that $$D_j (a_{ij} + \lambda a_{ik}) = D_j (a_{ij}).$$
Because of the invariance of determinants under transposition, all these properties are valid for rows as well.
In the sense of the last theorem stated here: taking a row, multiplying another row by an arbitrary number, and adding the result to the first is exactly the operation the theorem covers (for rows, by the transposition remark above), and the determinant remains unchanged. So if by "row replacement operation" you mean adding one row (or a multiple of one row) to another, the last theorem proves it.
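As a quick numerical illustration of the last theorem (an addition of mine, not part of the quoted book), add $\lambda$ times one column to another and compare determinants:

```python
import numpy as np

rng = np.random.default_rng(1)
D = rng.standard_normal((4, 4))

# Add lam times column k to column j (j != k), as in the theorem above.
j, k, lam = 1, 3, 2.5
D2 = D.copy()
D2[:, j] += lam * D2[:, k]

# The column operation leaves the determinant unchanged
assert np.isclose(np.linalg.det(D2), np.linalg.det(D))
```

By the transposition remark, the same check with rows (`D2[j, :] += lam * D2[k, :]`) also passes.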
Note: The theorems, corollaries and proofs were taken from the first chapter of the book "Linear Algebra", by Georgi E. Shilov (an intermediate proof-based book on multilinear algebra). http://www.amazon.com/Linear-Algebra-Dover-Books-Mathematics/dp/048663518X/ref=sr_1_sc_1?ie=UTF8&qid=1464396655&sr=8-1-spell&keywords=linear+algebra+shilob
If you have a matrix $\mathrm A \in \mathbb{R}^{n \times n}$ and you replace its $i$-th row by a linear combination of the $n$ rows of $\mathrm A$, this corresponds to left-multiplying $\mathrm A$ by
$$\mathrm E := \mathrm I_n - \mathrm{e}_i \mathrm{e}_i^T + \mathrm{e}_i \mathrm{c}^T = \mathrm I_n + \mathrm{e}_i (\mathrm{c} - \mathrm{e}_i)^T$$
where $\mathrm{e}_i$ is the vector with a one in the $i$-th entry and zeros elsewhere, and the entries of $\mathrm{c}$ are the coefficients of the linear combination. Thus, the determinant of the new matrix is
$$\det (\mathrm E \mathrm A) = \det(\mathrm E) \cdot \det (\mathrm A)$$
Using the Weinstein–Aronszajn determinant identity,
$$\det (\mathrm E) = \det (\mathrm I_n + \mathrm{e}_i (\mathrm{c} - \mathrm{e}_i)^T) = \det (1 + (\mathrm{c} - \mathrm{e}_i)^T \mathrm{e}_i) = 1 + (c_i - 1) = c_i$$
Hence, replacing a row by a linear combination of rows does change the determinant if the coefficient that multiplies the row being replaced is not $1$. For example, replacing a row by the same row multiplied by $2$ then doubles the determinant.
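Not part of the original answer, but the identity $\det(\mathrm E) = c_i$ and the resulting scaling of $\det(\mathrm A)$ can be verified numerically:

```python
import numpy as np

rng = np.random.default_rng(2)
n, i = 4, 1
A = rng.standard_normal((n, n))
c = rng.standard_normal(n)        # coefficients of the linear combination

# E = I - e_i e_i^T + e_i c^T = I + e_i (c - e_i)^T
e_i = np.zeros(n)
e_i[i] = 1.0
E = np.eye(n) + np.outer(e_i, c - e_i)

# det(E) equals c[i], the coefficient multiplying the row being replaced
assert np.isclose(np.linalg.det(E), c[i])
# Replacing row i by the linear combination scales the determinant by c[i]
assert np.isclose(np.linalg.det(E @ A), c[i] * np.linalg.det(A))
```

In particular, with `c = e_i + lam * e_k` for some $k \neq i$ one has $c_i = 1$, recovering the usual row replacement operation that leaves the determinant unchanged.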