Proof: $\det\pmatrix{\langle v_i , v_j \rangle}\neq0$ $\iff \{v_1,\dots,v_n\}~\text{l.i.}$


Let $V$ be a real inner product space and $S=\{v_1,v_2, \dots, v_n\}\subset V$. How am I to prove that $S$ is linearly independent if and only if the determinant of the matrix $A=(a_{ij})$ with entries $$ a_{ij}=\langle v_i , v_j \rangle$$ is nonzero?


Just to be clear, the matrix we're talking about is this one:

$$\pmatrix{\langle v_1,v_1\rangle & \langle v_1,v_2\rangle &\langle v_1,v_3\rangle & \cdots & \langle v_1,v_{n-1}\rangle & \langle v_1, v_n\rangle \\\langle v_2,v_1\rangle & \langle v_2,v_2\rangle &\langle v_2,v_3\rangle & \cdots & \langle v_2,v_{n-1} \rangle & \langle v_2,v_n\rangle \\\langle v_3,v_1\rangle & \langle v_3,v_2\rangle &\langle v_3,v_3\rangle & \cdots & \langle v_3,v_{n-1}\rangle & \langle v_3,v_n \rangle \\ \vdots&\vdots&\vdots&\ddots&\vdots&\vdots\\\langle v_{n-1}, v_1\rangle & \langle v_{n-1},v_2\rangle &\langle v_{n-1},v_3\rangle & \cdots & \langle v_{n-1},v_{n-1}\rangle & \langle v_{n-1},v_n\rangle \\\langle v_n,v_1\rangle & \langle v_n,v_2\rangle &\langle v_n,v_3\rangle & \cdots & \langle v_n,v_{n-1}\rangle & \langle v_n,v_n\rangle \\ }$$


In case anybody here has Roman's Advanced Linear Algebra: I believe that on page $261$ there is a small note on something similar.


Should anybody find it useful, here is a small C++ program that streams the LaTeX code for such an $n \times n$ matrix to a file named "matrix.txt", for whatever value of $n$ you enter:

#include <fstream>
#include <iostream>
using namespace std;

int main()
{
    ofstream fout;
    fout.open("matrix.txt");

    int n;
    cout << "Enter your desired n: ";
    cin >> n;

    fout << endl;
    fout << "$\\begin{pmatrix}" << endl;

    for( int j = 1 ; j <= n ; j++ )      // row index
    {
        for( int i = 1 ; i <= n ; i++ )  // column index
        {
            fout << "\\langle v_" << j << "," << "v_" << i << " \\rangle";
            if( j == n && i == n )
                fout << endl;            // last entry: no separator
            else if( i == n )
                fout << "\\\\" << endl;  // end of a row
            else
                fout << "&" << endl;     // between entries of a row
        }
    }

    fout << "\\end{pmatrix}$" << endl << endl;
    return 0;
}

For $n = 5$ you get this result:

$\begin{pmatrix} \langle v_1,v_1 \rangle& \langle v_1,v_2 \rangle& \langle v_1,v_3 \rangle& \langle v_1,v_4 \rangle& \langle v_1,v_5 \rangle\\ \langle v_2,v_1 \rangle& \langle v_2,v_2 \rangle& \langle v_2,v_3 \rangle& \langle v_2,v_4 \rangle& \langle v_2,v_5 \rangle\\ \langle v_3,v_1 \rangle& \langle v_3,v_2 \rangle& \langle v_3,v_3 \rangle& \langle v_3,v_4 \rangle& \langle v_3,v_5 \rangle\\ \langle v_4,v_1 \rangle& \langle v_4,v_2 \rangle& \langle v_4,v_3 \rangle& \langle v_4,v_4 \rangle& \langle v_4,v_5 \rangle\\ \langle v_5,v_1 \rangle& \langle v_5,v_2 \rangle& \langle v_5,v_3 \rangle& \langle v_5,v_4 \rangle& \langle v_5,v_5 \rangle \end{pmatrix}$


There are 4 solutions below.

Answer 1

Assume $V=\Bbb R^n$ with the standard inner product, and let $M=[v_1\;v_2\,\dots\,v_n]$ be the matrix whose columns are the $v_i$ (written $M$ to avoid a clash with the space $V$). Then $A=M^TM$, so

$$\det(A)=\det(M^TM)=\det(M^T)\det(M)=\det(M)^2.$$

The determinant of a square matrix is non-zero if and only if its columns form a linearly independent set of vectors, so $\det(A)\neq 0$ exactly when the $v_i$ are linearly independent.

Answer 2

Let $M=(v_1\,\dots\,v_n)$ be the matrix with columns $v_i$ (identifying $V$ with $\Bbb R^n$); then $(a_{ij})=M^TM$, so $\det (a_{ij})=\det(M^TM)=(\det M)^2$.

Answer 3

Hints (for you to understand, complete and, eventually, prove):

Suppose we have $\,\{v_1,...,v_m\}\subset V\;,\;\;\text{with}\;\;m\le\dim V\;$, so we have that the Gramian is

$$G:=A^tA\;,\;\;\text{with}\;\;A=\left(v_1\;v_2\,\ldots\;v_m\right)=\text{ the matrix with columns $\,v_i\,$}$$

Note that $\,G\,$ is your matrix, and it is an $\,m\times m\,$ square matrix, so:

$$(1)\;\;\;\det G=0\implies\exists\, 0\neq u\in\Bbb R^m\;\;\text{s.t.}\;\;Gu=0\implies 0=u^tGu=u^tA^tAu=(Au)^t(Au)=\left\|Au\right\|^2\implies Au=0$$

and since $\,u\neq 0\,$ this means the rank of $\,A\,$ isn't full, i.e. the columns $v_1,\dots,v_m$ are linearly dependent

(2) On the other hand, if $\;\text{rk}(A)\;$ isn't full then $\,\exists\,0\neq u\in\Bbb R^m\;\;\text{s.t.}\;\;Au=0\;$, so that

$$Gu=A^tAu=0\ldots\ldots$$

Answer 4

I'll give an answer that does not assume that $\def\R{\Bbb R}V=\R^n$ or even that $\dim V=n$, even though the essential argument boils down to that case anyway.

One direction is easy: if the $v_i$ satisfy a nontrivial linear dependence relation, then so do the corresponding rows of your matrix (and the columns as well) which forces the determinant to be zero.

In the other direction, suppose the $v_i$ are linearly independent. Then $W=\left<v_1,\ldots,v_n\right>\subseteq V$ is a subspace of dimension$~n$, and the linear map $f:W\to\R^n$ given by $f(w)=(\left<v_1,w\right>,\ldots,\left<v_n,w\right>)$ is injective: a vector $w$ in the kernel of $f$ is orthogonal to each $v_i$, hence by linearity to each vector of $W$, and this implies $w=0$ because (the restriction to$~W$ of) the bilinear form is non-degenerate. The image under the injective linear map $f$ of the linearly independent $n$-tuple of vectors $v_1,\ldots,v_n$ in$~W$ is a linearly independent $n$-tuple of vectors in$~\R^n$, and the matrix formed by these vectors has nonzero determinant.

A less abstract way to finish off the second part is to choose an orthonormal basis of$~W$ and express the $v_i$ in this basis. The matrix of $f$ with respect to this basis has as row$~i$ the list of coordinates of$~v_i$ in the chosen basis; in other words, it is the transpose of the matrix whose column$~j$ gives the coordinates of$~v_j$, and the two matrices have the same nonzero determinant$~d$. The matrix you are interested in is the product of these two mutually transpose matrices, and its determinant is therefore $d^2\neq0$.
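As a tiny worked instance of this argument (my own example, not from the answer): take $v_1=(1,0)$, $v_2=(1,1)$ in $\Bbb R^2$, with the standard basis serving as the orthonormal basis of $W$. Then

$$\pmatrix{\langle v_1,v_1\rangle & \langle v_1,v_2\rangle\\ \langle v_2,v_1\rangle & \langle v_2,v_2\rangle}=\pmatrix{1&1\\1&2},\qquad \det\pmatrix{1&1\\1&2}=2-1=1,$$

which equals $d^2$ for $d=\det\pmatrix{1&1\\0&1}=1$, the determinant of the matrix whose columns are $v_1,v_2$.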