After looking in my book for a couple of hours, I'm still confused about what it means for a $(n\times n)$-matrix $A$ to have a determinant equal to zero, $\det(A)=0$.
I hope someone can explain this to me in plain English.
On
There is a geometric interpretation of the determinant. In addition to the one in the other answer, another appealing interpretation is the determinant as the volume of an $n$-dimensional parallelepiped. This is easiest to see in three dimensions. Take $3$ vectors in 3-D and form the matrix with these vectors as its columns (or rows). If the determinant is zero, the vectors do not span a parallelepiped (they lie in a common plane); if it is nonzero, they form the three edges of a parallelepiped whose volume is the absolute value of the determinant. The sign of the determinant carries information about the orientation of this body.
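This interpretation is easy to check numerically; here is a small sketch (assuming NumPy is available, and the vectors are just an example):

```python
import numpy as np

# Three edge vectors of a box-like solid, stacked as rows of a 3x3 matrix.
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 2.0, 0.0])
w = np.array([0.0, 0.0, 3.0])
volume = abs(np.linalg.det(np.vstack([u, v, w])))  # volume of the 1x2x3 box

# If the third vector lies in the plane of the first two, the "box" is flat
# and the determinant (hence the volume) is zero.
w_flat = u + v
flat_volume = abs(np.linalg.det(np.vstack([u, v, w_flat])))
```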
On
Another way of putting it:
If you take $2$ vectors in $2$D space, you can show that the area of the parallelogram they form is simply the absolute value of the determinant of the matrix formed by those two vectors. This generalizes to $n$ dimensions: the determinant of a matrix is the (signed) volume of the $n$-dimensional parallelepiped formed by the rows of the matrix.
If the determinant is zero, the volume is zero. This can only happen when one of the vectors "overlaps" the others or, more formally, when the vectors are linearly dependent.
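A quick numerical illustration of the $2$D case (a sketch assuming NumPy; the vectors are just an example):

```python
import numpy as np

# Area of the parallelogram spanned by two 2-D vectors = |det|.
a = np.array([3.0, 0.0])
b = np.array([1.0, 2.0])
area = abs(np.linalg.det(np.vstack([a, b])))  # base 3, height 2

# Linearly dependent ("overlapping") vectors give zero area.
b_dep = 2.0 * a
area_dep = abs(np.linalg.det(np.vstack([a, b_dep])))
```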
On
Let $A$ be a square matrix.
Then, $$A^{-1} = \dfrac{1}{\det A}\,\operatorname{adj} A,$$ where $\operatorname{adj} A$ is the adjugate of $A$.
Hence, if the determinant is zero, the inverse does not exist for that matrix.
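A small check of this formula (a sketch assuming SymPy, whose `Matrix` class provides `det`, `inv`, and `adjugate`; the matrices are just examples):

```python
from sympy import Matrix

A = Matrix([[2, 1], [5, 3]])            # det = 2*3 - 1*5 = 1
inverse = A.inv()
from_adjugate = A.adjugate() / A.det()  # the formula above

B = Matrix([[2, 4], [1, 2]])            # rows proportional, so det = 0
# B.inv() would raise a "matrix is not invertible" error here,
# precisely because dividing by det(B) = 0 is impossible.
```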
On
For an $n\times n$ matrix, each of the following is equivalent to the condition of the matrix having determinant $0$:
The columns of the matrix are dependent vectors in $\mathbb R^n$
The rows of the matrix are dependent vectors in $\mathbb R^n$
The matrix is not invertible.
The volume of the parallelepiped determined by the column vectors of the matrix is $0$.
The volume of the parallelepiped determined by the row vectors of the matrix is $0$.
The system of homogeneous linear equations represented by the matrix has a non-trivial solution.
The determinant of the linear transformation determined by the matrix is $0$.
The constant term (free coefficient) of the characteristic polynomial of the matrix is $0$.
Depending on the definition of the determinant you saw, proving each equivalence can be more or less hard.
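Several of these equivalences can be checked numerically at once (a sketch assuming NumPy; the singular matrix below is just an example, with its third row the sum of the first two):

```python
import numpy as np

# A singular 3x3 matrix: row3 = row1 + row2.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0],
              [1.0, 3.0, 7.0]])

det = np.linalg.det(A)            # vanishes (up to floating-point error)
rank = np.linalg.matrix_rank(A)   # 2 < 3: rows and columns are dependent
coeffs = np.poly(A)               # characteristic polynomial coefficients
constant_term = coeffs[-1]        # also ~0, matching the last item above
```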
On
The determinant of a matrix is the oriented volume of the image of the unit cube. If it is zero, the unit cube gets mapped into a hyperplane (a lower-dimensional subspace), so its image has volume zero.
On
From the Gaussian elimination perspective: for any matrix, there are elementary matrices $E_r,\dots,E_1$, corresponding to some elementary row operations, such that $$E_r \cdots E_1 A = A_{red},$$ where $A_{red}$ is the reduced row echelon form, or simply the reduced form, of the matrix.
If $A$ is a square matrix and the determinant of $A$ is zero, then $A_{red}$ is not the identity matrix: each elementary matrix has a nonzero determinant, so $\det(A) = 0$ forces $\det(A_{red}) = 0$, meaning $A_{red}$ must contain at least one zero row.
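A small SymPy sketch of this perspective (the singular matrix is just an example):

```python
from sympy import Matrix, eye

A = Matrix([[1, 2], [2, 4]])   # det = 0: second row is twice the first
A_red, pivots = A.rref()       # reduced row echelon form and pivot columns

# A singular square matrix does not reduce to the identity:
# its RREF here is [[1, 2], [0, 0]], with only one pivot column.
```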
If the determinant of an $n\times n$ matrix $A$ is zero, then $A$ is not invertible. This is a crucial test of whether a square matrix has an inverse; when the inverse does exist, it gives a unique solution to, e.g., the equation $Ax = b$ for any given vector $b$.
When the determinant of a matrix is zero, the system of equations associated with it is linearly dependent; that is, at least one row of the matrix is a linear combination of the others (in the simplest case, a scalar multiple of another row, though the dependence can involve several rows at once).
[When the determinant of a matrix is nonzero, the rows of the linear system it represents are linearly independent.]
When the determinant of a matrix is zero, its rows are linearly dependent vectors, and its columns are linearly dependent vectors.
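For instance (a NumPy sketch; note the dependent row below is a combination of the other two, not a scalar multiple of any single row):

```python
import numpy as np

# row3 = row1 + 2*row2, so the rows are linearly dependent.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 2.0, 4.0]])
det = np.linalg.det(A)                  # ~0
row_rank = np.linalg.matrix_rank(A)     # 2: rows dependent
col_rank = np.linalg.matrix_rank(A.T)   # 2: columns dependent too
```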