Augmenting a matrix with a column and a row — what happens to the determinant?


Let $\mathbb A = (a_{i,j})_{1\leq i,j \leq n}$ be a real matrix such that $\det(\mathbb A)\neq0$. Suppose we border $\mathbb A$ with a new column $(a_{i,0})_{0\leq i \leq n}$ on the left and a new row $(a_{0,j})_{0\leq j \leq n}$ on top, obtaining an $(n+1)\times(n+1)$ matrix $\mathbb A'$.

  1. Under what sufficient conditions on the newly added numbers do we still have $\det(\mathbb A')\neq0$ ?

  2. Under what necessary and sufficient conditions on the newly added numbers do we still have $\det(\mathbb A')\neq0$ ?


BEST ANSWER

Essentially, you're considering $A'=\begin{pmatrix} a_0 & b^\top\\ c & A \end{pmatrix}$ for scalar $a_0$, column vectors $b,c$ and a nonsingular matrix $A$. A tool which is applicable for matrices of this form is the Schur complement. This is based on performing block-Gaussian elimination. In particular, we note the following block-diagonalization of $A'$:

$$\begin{pmatrix}1 & -b^\top A^{-1} \\ 0 & I_n\end{pmatrix} \begin{pmatrix} a_0 & b^\top\\ c & A \end{pmatrix} \begin{pmatrix}1 & 0\\ -A^{-1}c& I_n\end{pmatrix} =\begin{pmatrix}a_0-b^\top A^{-1}c & 0 \\ 0 & A\end{pmatrix}$$ which applies so long as $A$ is invertible. One may then check that the block-triangular matrices have determinant $1$, so $$\det(A')=(a_0-b^\top A^{-1}c)\det(A).$$ Hence $A'$ is invertible if and only if $a_0 \neq b^\top A^{-1}c$.
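The determinant identity is easy to check numerically. Here is a small numpy sketch (variable names are my own) that builds a random bordered matrix and verifies $\det(A')=(a_0-b^\top A^{-1}c)\det(A)$, and that forcing $a_0 = b^\top A^{-1}c$ makes $A'$ singular:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))   # a generic A is invertible with probability 1
b = rng.standard_normal(n)        # new top row (without the corner)
c = rng.standard_normal(n)        # new left column (without the corner)
a0 = rng.standard_normal()        # corner entry

# Bordered matrix A' = [[a0, b^T], [c, A]]
Ap = np.block([[np.atleast_2d(a0), b[None, :]],
               [c[:, None],        A]])

# Schur-complement identity: det(A') = (a0 - b^T A^{-1} c) det(A)
schur = a0 - b @ np.linalg.solve(A, c)
assert np.isclose(np.linalg.det(Ap), schur * np.linalg.det(A))

# Choosing a0 exactly equal to b^T A^{-1} c makes A' singular
Ap_sing = Ap.copy()
Ap_sing[0, 0] = b @ np.linalg.solve(A, c)
assert abs(np.linalg.det(Ap_sing)) < 1e-8
```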


This is my attempt:

Let $A$ be your matrix, so $\det A\neq 0$. Let me add the new column on the left and the new row on top; the new matrix is $N=\begin{bmatrix}a & b \\c & A \end{bmatrix}$, where $a\in\mathbb{R}$, $c$ is a column vector and $b$ is a row vector. Since $A$ is invertible, its rows span $\mathbb{R}^n$, so $b=k_1A_{1}+k_2A_{2}+\dots+k_nA_{n}$ (where $A_k$ is the $k$th row of $A$); that is, $b$ can be written as a linear combination of the rows of $A$.

Row operations (with nonzero scalings) don't change whether a determinant is $0$ or not, so you can perform row operations on the first row until $b$ turns into a row of $A$, say $A_k$:

$$\begin{bmatrix}a' & A_k \\c & A \end{bmatrix}$$

Here we have $a'$ instead of $a$ because the row operations also act on the first entry of the row.

For $\det N\neq 0$, you need all the rows to be linearly independent, so you need $a'\neq c_k$, where $c_k$ is the entry of $c$ lying in the same row as $A_k$. Hope this helps!

As an example consider $A=\begin{bmatrix}1 & 0 \\ 0 & 1\end{bmatrix}$, let $b=\begin{bmatrix}2& 3\end{bmatrix}$, then: $$\begin{bmatrix}a & 2 & 3 \\ c_1 & 1 & 0 \\ c_2 & 0 & 1 \end{bmatrix}$$ We want to apply row operations to turn $b$ into one of the rows of $A$, for instance: $$\begin{bmatrix}a & 2 & 3 \\ c_1 & 1 & 0 \\ c_2 & 0 & 1 \end{bmatrix} \rightarrow \begin{bmatrix}a/3 & 2/3 & 1 \\ c_1 & 1 & 0 \\ c_2 & 0 & 1 \end{bmatrix}\rightarrow \begin{bmatrix}a/3-2c_1/3 & 0 & 1 \\ c_1 & 1 & 0 \\ c_2 & 0 & 1 \end{bmatrix}$$

Then the condition would be that $a/3-2c_1/3\neq c_2$, i.e. (after doing some algebra) $a\neq 2c_1+3c_2$. For the sake of verifying, let's violate the condition, so $a = 2c_1+3c_2$: take $a=5$, $c_1=1$, $c_2=1$; then $$\det \begin{bmatrix}5 & 2 & 3 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}=5-2-3=0$$
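The worked example can be checked numerically as well. A short numpy sketch: the choice $a=5$, $c_1=c_2=1$ satisfies $a=2c_1+3c_2$ and gives a singular matrix, while perturbing $a$ restores invertibility (note this matches the best answer's condition, since with $A=I$, $b=(2,3)$ we get $b^\top A^{-1}c = 2c_1+3c_2$):

```python
import numpy as np

# Violating the condition: a = 2*c1 + 3*c2 with a = 5, c1 = c2 = 1
M_sing = np.array([[5., 2., 3.],
                   [1., 1., 0.],
                   [1., 0., 1.]])
assert np.isclose(np.linalg.det(M_sing), 0.0)

# Any a != 2*c1 + 3*c2 keeps the matrix invertible, e.g. a = 6
M_ok = M_sing.copy()
M_ok[0, 0] = 6.0
assert not np.isclose(np.linalg.det(M_ok), 0.0)
```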