Prove that the Gram determinant $G(x_1,\dots, x_n)=0$ if and only if $x_1, \dots, x_n$ are linearly dependent.
So I know that
$G(x_1,\dots, x_n)=\begin{vmatrix} \xi( x_1,x_1) & \xi( x_1,x_2) &\dots & \xi( x_1,x_n)\\ \xi( x_2,x_1) & \xi( x_2,x_2) &\dots & \xi( x_2,x_n)\\ \vdots&\vdots&\ddots&\vdots\\ \xi( x_n,x_1) & \xi( x_n,x_2) &\dots & \xi( x_n,x_n)\end{vmatrix}$
(where $\xi$ denotes an inner product), so $G(x_1,\dots, x_n)=0$ says precisely that this determinant vanishes. Why does that happen exactly when $x_1,\dots, x_n$ are linearly dependent, though?
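A quick numeric sanity check (my own illustration, not part of the question): if the $x_i$ are stacked as the columns of a matrix $X$, the matrix of inner products above is exactly $X^TX$.

```python
# Sketch in plain Python: the Gram matrix of the columns of X equals X^T X.
X = [[1, 0],
     [2, 1],
     [0, 3]]  # columns x_1 = (1,2,0), x_2 = (0,1,3) in R^3

n = len(X[0])
cols = [[row[j] for row in X] for j in range(n)]  # extract the columns x_j

# Gram matrix: entry (i, j) is the inner product <x_i, x_j>
gram = [[sum(a * b for a, b in zip(cols[i], cols[j])) for j in range(n)]
        for i in range(n)]

# X^T X computed entry by entry
xtx = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
       for i in range(n)]

assert gram == xtx
print(gram)  # [[5, 2], [2, 10]]
```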
Write $X$ for the matrix whose columns are $x_1,\dots,x_n$, so the Gram matrix is $X^TX$. Since a square matrix is singular iff it has a nontrivial null vector, $G(X)=\det(X^TX)=0\iff \exists a\in{\bf R}^n,\ a\neq 0:(X^TX)a=X^T(Xa)=0$. In expanded form:
$$\begin{array}{cr} & \begin{vmatrix} \langle x_1,x_1\rangle & \langle x_1,x_2\rangle & \dots & \langle x_1,x_n \rangle \\ \langle x_2,x_1 \rangle & \langle x_2,x_2 \rangle &\dots & \langle x_2,x_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle x_n,x_1 \rangle & \langle x_n,x_2 \rangle & \dots & \langle x_n,x_n \rangle \end{vmatrix}=0 \\ \iff \exists (a_1,\cdots,a_n)\neq(0,\cdots,0): & a_1\begin{pmatrix}\langle x_1,x_1\rangle \\ \langle x_2,x_1\rangle \\ \vdots \\ \langle x_n,x_1\rangle\end{pmatrix} +\cdots+ a_n\begin{pmatrix}\langle x_1,x_n\rangle \\ \langle x_2,x_n\rangle \\ \vdots \\ \langle x_n,x_n\rangle\end{pmatrix}=0 \\ \iff \exists (a_1,\cdots,a_n)\neq(0,\cdots,0): & \begin{pmatrix}\langle x_1,a_1x_1+\cdots+a_nx_n\rangle \\ \langle x_2,a_1x_1+\cdots+a_nx_n\rangle \\ \vdots \\ \langle x_n,a_1x_1+\cdots+a_nx_n\rangle\end{pmatrix}=0 \end{array}$$
Now, $s=a_1x_1+\cdots+a_nx_n\in\operatorname{span}(x_1,\cdots,x_n)=S$, and $s$ is orthogonal to each of the spanning vectors $x_i$, hence to every element of $S$, in particular to itself; $\langle s,s\rangle=0$ forces $s=0$. Since the $a_i$ are not all zero, $0=s=a_1x_1+\cdots+a_nx_n$ is a nontrivial relation, i.e. the $x_i$ are linearly dependent. Every step in the chain is reversible, which gives the converse as well.
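To make the equivalence concrete, here is a small check (my own sketch, not part of the answer) in ${\bf R}^3$: the Gram determinant of three independent vectors is nonzero, while making one vector a sum of the others drives it to zero.

```python
# Sketch: Gram determinant vanishes exactly when the vectors are dependent.

def dot(u, v):
    # Standard inner product on R^3
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_det(x1, x2, x3):
    # Gram matrix: g[i][j] = <x_i, x_j>
    g = [[dot(a, b) for b in (x1, x2, x3)] for a in (x1, x2, x3)]
    # 3x3 determinant by cofactor expansion along the first row
    return (g[0][0] * (g[1][1] * g[2][2] - g[1][2] * g[2][1])
          - g[0][1] * (g[1][0] * g[2][2] - g[1][2] * g[2][0])
          + g[0][2] * (g[1][0] * g[2][1] - g[1][1] * g[2][0]))

# Independent vectors (the standard basis): nonzero Gram determinant
print(gram_det((1, 0, 0), (0, 1, 0), (0, 0, 1)))  # 1
# Dependent vectors (x_3 = x_1 + x_2): Gram determinant is 0
print(gram_det((1, 0, 0), (0, 1, 0), (1, 1, 0)))  # 0
```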