$48$ reasons why a matrix is singular


I'm currently using the MIT recordings of Prof. Strang's lectures to review the derivation of the determinant formula.

Let's say we take a closer look at

$$ \left\vert\begin{array}{c c} a & 0\\ c & 0 \end{array}\right\vert $$

And to quote from the transcript:

Why is that determinant nothing, forget him?

Well, it has a column of zeros.

And by the -- well, so one way to think is, well, it's a singular matrix.

Oh, for, for like forty-eight different reasons. That determinant is zero.

Now $48$, that's an oddly specific number. And while I do believe he was exaggerating, it made me curious just how many reasons there really are (excluding, of course, the determinant's being zero) for that matrix's being singular.

What are they?


There are 2 best solutions below

On BEST ANSWER

Here are $23$ equivalent conditions for a matrix to be nonsingular; just take the negation of each statement and you have $23$ conditions for a matrix to be singular:

http://mathworld.wolfram.com/InvertibleMatrixTheorem.html
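Several of those conditions are easy to check numerically. Here is a small NumPy sketch (not part of the original answer) that tests three of them on the matrix in question, using arbitrary nonzero values $a=2$, $c=3$:

```python
import numpy as np

# Arbitrary nonzero values for a and c, chosen only for illustration.
a, c = 2.0, 3.0
A = np.array([[a, 0.0],
              [c, 0.0]])

# Three of the equivalent "singular" conditions, checked numerically:
print(np.linalg.det(A))          # determinant is 0
print(np.linalg.matrix_rank(A))  # rank is 1, not 2
print(np.linalg.eigvals(A))      # 0 is an eigenvalue
```

Any one of these failing the "nonsingular" test is enough; the point of the theorem is that they fail or hold together.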


I am sure that 48 is a humorously specific way of saying "many ways." In the lecture context described, I interpret the question to mean, "In how many different ways can someone trained in linear algebra see, at a glance, that this matrix is singular?" Two reasons should count as different so long as the mental process of checking them is even a little bit different, even though they are probably logically equivalent (almost trivially so).

To avoid casework, I'll presume for simplicity's sake that $a\neq 0\neq c$ in $A=\left[\begin{matrix}a&0\\c&0\end{matrix}\right]$.

  1. There is a column of 0's.
  2. The second row $[c\;0]$ is a scalar multiple of the first row $[a\;0]$.
  3. The second column is a scalar multiple of the first column.
  4. The rows are not linearly independent, since $c[a\;0] -a[c\;0]=[0\;0]$.
  5. The columns are not linearly independent, since $0\left[\begin{matrix}a\\c\end{matrix}\right]+1\left[\begin{matrix}0\\0\end{matrix}\right] = \left[\begin{matrix}0\\0\end{matrix}\right]$.
  6. Quickly row-reducing the matrix yields that its reduced echelon form is $\left[\begin{matrix}1&0\\0&0\end{matrix}\right]\neq I$.
  7. The product $\left[\begin{matrix}a&0\\c&0\end{matrix}\right] \left[\begin{matrix}0\\1\end{matrix}\right]=\left[\begin{matrix}0\\0\end{matrix}\right]$.
  8. The map $\vec{x}\mapsto A\vec{x}$ is not one-to-one, because $A\left[\begin{matrix}0\\1\end{matrix}\right] = A\left[\begin{matrix}0\\0\end{matrix}\right]$.
  9. Since $x_1 \begin{bmatrix}a\\c\end{bmatrix} + x_2 \begin{bmatrix}0\\0\end{bmatrix} = x_1\begin{bmatrix}a\\c\end{bmatrix}$, the column space of $A$ is one-dimensional (all scalar multiples of $\begin{bmatrix}a\\c\end{bmatrix}$), not $\mathbb{R}^2$.
  10. The row space is clearly $\left\{\begin{bmatrix}x&0\end{bmatrix}: x\in\mathbb{R}\right\}\neq\mathbb{R}^2$, so $A$ is not invertible.
  11. Since $\begin{bmatrix}a&0&1\\c&0&0\end{bmatrix}$ row-reduces to $\begin{bmatrix}1&0&0\\0&0&1\end{bmatrix}$, the equation $A\vec{x}=\begin{bmatrix}1\\0\end{bmatrix}$ has no solution.
  12. Since $A\begin{bmatrix}0\\1\end{bmatrix}=\vec{0}=0\begin{bmatrix}0\\1\end{bmatrix}$, we see that $0$ is an eigenvalue of $A$, with eigenvector $\begin{bmatrix}0\\1\end{bmatrix}$.

I'm far short of 48, but I'm getting tired already.
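Several of the reasons above can also be verified in a few lines of NumPy. This sketch (my addition, again with arbitrary nonzero values $a=2$, $c=3$) checks reasons 4, 7, 9, and 11 from the list:

```python
import numpy as np

a, c = 2.0, 3.0                 # arbitrary nonzero values, for illustration
A = np.array([[a, 0.0],
              [c, 0.0]])

# Reason 7: A sends (0, 1) to the zero vector.
print(A @ np.array([0.0, 1.0]))      # [0. 0.]

# Reason 4: the rows are dependent: c*(row 1) - a*(row 2) = 0.
print(c * A[0] - a * A[1])           # [0. 0.]

# Reason 9: the column space is one-dimensional.
print(np.linalg.matrix_rank(A))      # 1

# Reason 11: A x = (1, 0) has no solution; even the least-squares
# "solution" leaves a nonzero residual.
b = np.array([1.0, 0.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(A @ x - b))     # strictly positive
```

Of course, running code is yet another "mental process of checking," so perhaps each of these counts as a separate reason toward the 48.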