Why can an LU decomposition fail to exist?


I understand that for some square matrices, an LU decomposition (not talking about LUP) can fail to exist.

For example, the following matrix has no LU decomposition $$ \begin{bmatrix} 0 & 1 \\ 2 & 1 \end{bmatrix} $$
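For this particular example, the non-existence can be checked directly by writing out a candidate factorization with a unit lower-triangular $L$ and seeing that the entries cannot be satisfied simultaneously:

$$ \begin{bmatrix} 0 & 1 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ \ell & 1 \end{bmatrix} \begin{bmatrix} u_{11} & u_{12} \\ 0 & u_{22} \end{bmatrix} = \begin{bmatrix} u_{11} & u_{12} \\ \ell\, u_{11} & \ell\, u_{12} + u_{22} \end{bmatrix} $$

The $(1,1)$ entry forces $u_{11} = 0$, but then the $(2,1)$ entry is $\ell\, u_{11} = 0 \neq 2$, a contradiction, so no such $L$ and $U$ exist.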

Wikipedia states that an $n \times n$ matrix "has no LU factorization if the first (n−1) columns are non-zero and linearly independent and at least one leading principal minor is zero".

I'm confused as to why this is the case. Is there a more intuitive explanation of the general reason a square matrix might not have an LU decomposition?
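To make the failure mode concrete, here is a minimal sketch (not from the question, and `lu_no_pivot` is a made-up name) of Doolittle-style elimination without row exchanges. On the example matrix above, Gaussian elimination stalls at the zero pivot in position $(1,1)$, which is exactly the situation the quoted condition describes:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization with no pivoting (illustrative sketch).

    Raises ValueError when elimination hits a zero pivot that still has
    nonzero entries below it, i.e. when no plain LU factorization is
    produced by this process.
    """
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float)
    for k in range(n - 1):
        if U[k, k] == 0:
            if np.any(U[k + 1:, k] != 0):
                raise ValueError(f"zero pivot at step {k}: cannot eliminate below it")
            continue  # zero pivot but nothing below to eliminate
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # multiplier stored in L
            U[i, :] -= L[i, k] * U[k, :]  # eliminate entry below the pivot
    return L, U

A = np.array([[0, 1], [2, 1]])
try:
    lu_no_pivot(A)
except ValueError as e:
    print("LU failed:", e)
```

With a row swap (i.e., LUP) the same matrix factors immediately, since swapping the rows moves the nonzero entry 2 into the pivot position.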