Linear independence and determinants


I'm trying to do exercise 6.8 from the book "Advanced Calculus of Several Variables", by Edwards. There it is stated that: Let $\boldsymbol{a}_{i}=\left(a_{i1}, a_{i2}, \ldots, a_{in} \right), i=1,\ldots,k<n$, be $k$ linearly dependent vectors in $\mathbb{R}^{n}$. Then show that every $k \times k$ submatrix of the matrix: \begin{equation*} \begin{pmatrix} a_{11} &\cdots & a_{1n}\\ \vdots & & \vdots\\ a_{k1} & \cdots & a_{kn} \end{pmatrix} \end{equation*} has zero determinant.

Any hint on how this can be solved? (If you have access to the book, any hint that uses the material covered in the book up to that chapter is highly appreciated).


Best answer:

If $a_1,\cdots, a_k$ are dependent, there exists a non-trivial set of scalars $c_1,\cdots, c_k$ such that $c_1a_1 + \cdots + c_ka_k = \mathbf 0$,

or

$\begin{bmatrix} c_1&\cdots &c_k \end{bmatrix} \begin{bmatrix} a_{11} &\cdots &a_{1n}\\ \vdots&\ddots&\vdots\\a_{k1} & \cdots & a_{kn}\end{bmatrix} = \begin {bmatrix}0&\cdots &0\end{bmatrix}$

Suppose we choose $k$ columns from the matrix $A$; call the resulting submatrix $A_k$.

It is still the case that $c^TA_k = \mathbf 0$ with $c \neq \mathbf 0$, so the rows of $A_k$ are linearly dependent and $A_k$ is a singular matrix.

If it is singular, its determinant is 0.
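As a sanity check (not part of the book's material), the claim can be verified numerically: pick dependent rows, form every choice of $k$ columns, and confirm each submatrix's determinant vanishes. The matrix below is a made-up example with $a_2 = 3a_1$, so $c = (3, -1)$ is a non-trivial dependency.

```python
import numpy as np
from itertools import combinations

# Hypothetical example: k = 2 dependent vectors in R^4 (second row = 3 * first row).
A = np.array([[1.0, 2.0, -1.0, 0.5],
              [3.0, 6.0, -3.0, 1.5]])
k, n = A.shape

# The dependency c1*a1 + c2*a2 = 0 with c = (3, -1):
c = np.array([3.0, -1.0])
print(np.allclose(c @ A, 0))  # True

# Every k x k submatrix (every choice of k columns) has zero determinant.
dets = [np.linalg.det(A[:, list(cols)]) for cols in combinations(range(n), k)]
print(all(abs(d) < 1e-12 for d in dets))  # True
```

The same coefficient vector $c$ annihilates each submatrix, which is exactly the argument above: deleting columns does not disturb the row dependency.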

Another answer:

A $k \times k$ submatrix is precisely what you get by deleting columns of the matrix above. By assumption there is a linear dependency among the rows of the $k \times n$ matrix, and that same dependency holds among the rows of any $k \times k$ submatrix obtained by deleting columns (just look at each column/coordinate individually).