How can I prove that a swap matrix (a matrix that, when multiplied with another matrix, swaps a pair of its rows or columns) has determinant $-1$?
Prove that the determinant of a swap matrix is $-1$
Asked 2026-02-23 by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · 911 views · 1 answer
There are a few ways to define the determinant, and the proof changes accordingly.
Eigenvalues
The determinant is defined to be the product of the (complex) eigenvalues to the power of their multiplicities (the dimension of the corresponding generalised eigenspace).
Let $e_i$ be the $i$th standard basis vector. In this case, if the $n \times n$ swap matrix swaps the $i$th and $j$th column ($i < j$), then the vectors $$e_1, \ldots, e_{i-1}, e_{i+1}, \ldots, e_{j-1}, e_{j+1}, \ldots, e_n$$ are all linearly independent eigenvectors corresponding to eigenvalue $1$, as is the vector $e_i + e_j$. Meanwhile, the vector $e_i - e_j$ is an eigenvector corresponding to eigenvalue $-1$.
Hence, eigenvalue $1$ has multiplicity at least $n - 1$, and eigenvalue $-1$ has multiplicity at least $1$. Since we have found $n$ linearly independent eigenvectors, this is a complete list of eigenvalues and their multiplicities. Hence, the determinant is $1^{n - 1} \cdot (-1)^1 = -1$.
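The eigenvector bookkeeping above is easy to check directly. Below is a minimal pure-Python sanity check (the helper names `swap_matrix` and `matvec` are mine, not from the question), verifying the three eigenvector claims for the matrix swapping the 2nd and 4th coordinates:

```python
def swap_matrix(n, i, j):
    """n x n identity matrix with rows i and j exchanged (0-based indices)."""
    m = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
    m[i], m[j] = m[j], m[i]
    return m

def matvec(m, v):
    """Matrix-vector product on plain nested lists."""
    return [sum(m[r][c] * v[c] for c in range(len(v))) for r in range(len(v))]

n, i, j = 4, 1, 3
s = swap_matrix(n, i, j)
e = lambda k: [1 if r == k else 0 for r in range(n)]  # k-th standard basis vector

# e_k for k != i, j and e_i + e_j are fixed: eigenvalue 1, multiplicity n - 1.
assert all(matvec(s, e(k)) == e(k) for k in range(n) if k not in (i, j))
plus = [a + b for a, b in zip(e(i), e(j))]
assert matvec(s, plus) == plus

# e_i - e_j is negated: eigenvalue -1, multiplicity 1.
minus = [a - b for a, b in zip(e(i), e(j))]
assert matvec(s, minus) == [-x for x in minus]
```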
Cofactor Expansion
We can use the fact that a cofactor expansion can be made along any row or column, with an appropriate sign pattern; in particular, diagonal entries are always counted with a $+$ sign. Note also that, expanding along a row that is not involved in the swap, the diagonal $1$ is the only non-zero entry in that row, and the corresponding minor is a swap matrix of one dimension smaller. For example, expanding along the 3rd row: $$\begin{vmatrix}1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0\end{vmatrix} = \begin{vmatrix}1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0\end{vmatrix}.$$ Reducing inductively, you reach the $2 \times 2$ case, which can be computed directly: $$\begin{vmatrix}0 & 1 \\ 1 & 0\end{vmatrix} = 0 \cdot 0 - 1 \cdot 1 = -1.$$
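The inductive reduction can be mirrored in code. This is an illustrative sketch only (it expands along the first row of plain nested lists, rather than choosing an unswapped row as above), applied to the $4 \times 4$ matrix from the example:

```python
def det_cofactor(m):
    """Determinant by Laplace (cofactor) expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then recurse.
        minor = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * m[0][j] * det_cofactor(minor)
    return total

# Identity with rows 2 and 4 swapped, as in the worked example.
swap = [[1, 0, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0],
        [0, 1, 0, 0]]
print(det_cofactor(swap))  # -1
```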
Leibniz Formula
The determinant of an $n \times n$ matrix $(a_{i,j})_{i,j = 1}^n$ can be defined as follows: $$\sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)},$$ where $\operatorname{sgn}(\sigma)$ is $1$ when $\sigma$ is even, and $-1$ when $\sigma$ is odd. Note that the swap matrix is the permutation matrix of a transposition $\tau$: $$a_{i,j} = \delta_{\tau(i), j},$$ where $\delta_{i, j}$ is $1$ when $i = j$ and $0$ otherwise (i.e. the entries of the identity matrix). Reindexing the product by $i \mapsto \tau(i)$ and using $\tau^2 = \operatorname{id}$ gives $\prod_{i=1}^n a_{i, \sigma(i)} = \prod_{i=1}^n \delta_{i, \sigma \circ \tau(i)}$. Moreover, composing a permutation with the transposition $\tau$ flips its parity, so $\operatorname{sgn}(\sigma \circ \tau) = -\operatorname{sgn}(\sigma)$. Hence, \begin{align*}\sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)} &= \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n \delta_{i, \sigma \circ \tau (i)} \\ &= - \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma \circ \tau) \prod_{i=1}^n \delta_{i, \sigma \circ \tau (i)} \\ &= - \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^n \delta_{i, \sigma(i)} \\ &= - \operatorname{det}(I) = -1, \end{align*} where the third equality replaces the summation variable $\sigma$ by $\sigma \circ \tau$, which ranges over all of $S_n$ as $\sigma$ does.
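For small $n$ the Leibniz sum is directly computable. A brute-force sketch (helper names are mine) using `itertools.permutations`, with the sign computed by counting inversions:

```python
from itertools import permutations

def sgn(p):
    """Sign of a permutation given as a tuple: (-1) ** (number of inversions)."""
    inversions = sum(1 for a in range(len(p)) for b in range(a + 1, len(p))
                     if p[a] > p[b])
    return -1 if inversions % 2 else 1

def det_leibniz(m):
    """Leibniz formula: sum over all permutations of signed diagonal products."""
    n = len(m)
    total = 0
    for p in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= m[i][p[i]]
        total += sgn(p) * prod
    return total

# Permutation matrix of the transposition swapping positions 2 and 4.
swap = [[1, 0, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0],
        [0, 1, 0, 0]]
print(det_leibniz(swap))  # -1: only sigma = tau contributes, with sgn(tau) = -1
```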