Directly prove strictly column diagonally dominant matrix is nonsingular using matrix norms


It is the case that, if a matrix $A \in \mathbb{C}^{n \times n}$ is strictly column diagonally dominant, meaning

$$|a_{jj}|> \sum_{i \neq j}^{n}|a_{ij}|, \quad 1 \leq j \leq n $$

then it is nonsingular.

I want to prove this theorem using the fact that, if $B, E \in \mathbb{C}^{n \times n}$ where $B$ is nonsingular and $\|B^{-1}E\|_{p} < 1$, then $B+E$ is nonsingular. Here $\| \cdot \|_{p}$ denotes the operator norm induced by the vector $p$-norm. What I am trying to prove should be similar to the answer given to this question.

I realize that, if I take the conjugate transpose of $A$, then $A^*$ is strictly row diagonally dominant (conjugation does not change the absolute values of the entries) and the linked answer applies. However, I want to see whether the proof can be carried out directly with matrix norms, without passing to the transpose. This is where I run into trouble.

For example, for the $3 \times 3$ case, if I let $A = D + E$, where $D$ is the matrix containing the diagonal entries of A, then

$$D^{-1}E = \begin{pmatrix} 0 & \frac{a_{12}}{a_{11}} & \frac{a_{13}}{a_{11}} \\ \frac{a_{21}}{a_{22}} & 0 & \frac{a_{23}}{a_{22}} \\ \frac{a_{31}}{a_{33}} & \frac{a_{32}}{a_{33}} & 0 \end{pmatrix}$$

But summing the absolute values of the entries along rows or columns (which gives the infinity norm and the one norm, respectively) does not allow me to use the inequality above: column dominance bounds $\sum_{i \neq j} |a_{ij}|/|a_{jj}|$, whereas the columns of $D^{-1}E$ involve the mixed quotients $|a_{ij}|/|a_{ii}|$.