Inner product space, prove $\det(A) \geq 0$ given a Gram matrix $A$


Let $\mathbb K=\mathbb R$ and let $V$ be a finite-dimensional inner product space over $\mathbb K$ with $\dim(V)=n$. Consider $v_1,\ldots,v_m \in V$ with $1 \leq m \leq n$. Let $A \in\Bbb K^{m\times m}$ be defined by $A=(a_{ij})$ with $a_{ij}=\langle v_i,v_j \rangle$. Prove the following:

1) $\det(A) \geq 0$

2) $\det(A)>0$ if and only if $v_1,\ldots,v_m$ are linearly independent

What I thought of was to consider $S=\langle v_1,\ldots,v_m \rangle$, the subspace spanned by these vectors, and an orthonormal basis $\mathcal B_S=\{s_1,\ldots,s_k\}$ of $S$ (with $k \leq m$).

Each $v_i$ can be written as $$v_i=\sum_{t=1}^kb_{ti}s_t$$

The determinant of $A$ is defined as $$\det(A)=\sum_{\sigma \in S_m} \operatorname{sg}(\sigma)a_{\sigma(1)1}\cdots a_{\sigma(m)m}$$$$=\sum_{\sigma \in S_m} \operatorname{sg}(\sigma)\langle v_{\sigma(1)},v_1 \rangle\cdots\langle v_{\sigma(m)},v_m \rangle$$$$=\sum_{\sigma \in S_m} \operatorname{sg}(\sigma)\left (\sum_{l_1=1}^kb_{{l_1}\sigma(1)}b_{{l_1}1}\right)\cdots\left (\sum_{l_m=1}^kb_{{l_m}\sigma(m)}b_{{l_m}m} \right ).$$

I don't understand why this last expression should be greater than or equal to $0$. I would appreciate suggestions on how to continue the exercise.
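One way to package the expansion above (writing $B=(b_{ti})\in\mathbb K^{k\times m}$ for the coordinate matrix of the $v_i$ in the basis $\mathcal B_S$): orthonormality of the $s_t$ gives $$a_{ij}=\langle v_i,v_j\rangle=\sum_{t=1}^k b_{ti}b_{tj},\qquad\text{i.e.}\qquad A=B^{t}B.$$ If $k<m$ then $\operatorname{rank}(A)\leq k<m$, so $\det(A)=0$; if $k=m$ then $\det(A)=(\det B)^2 \geq 0$.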


BEST ANSWER

HINT:

With OP's idea: consider a subspace $W$ of dimension exactly $m$ containing the $m$ vectors, and take an orthonormal basis $u_1,\ldots,u_m$ of $W$. (You are now effectively in the case of $m$ vectors in $\mathbb{R}^m$.) Express the $v_i$ in terms of the $u_i$ using a square $m\times m$ matrix $M$:

$$(v_1, \ldots, v_m) = (u_1, \ldots, u_m) \cdot M$$

Then the matrix $(\langle v_i, v_j\rangle)$ equals $$\langle (v_1, \ldots, v_m)^t, (v_1, \ldots, v_m) \rangle = M^t \cdot (\langle u_i, u_j \rangle) \cdot M= M^t \cdot M,$$ since the matrix $(\langle u_i, u_j \rangle)$ is the identity, $(u_i)$ being an orthonormal system. Therefore $$\det (\langle v_i, v_j\rangle) = (\det M)^2 \ge 0,$$ with equality if and only if $M$ is singular, that is, if and only if the vectors $v_i$ are linearly dependent.
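A quick numerical sketch of this identity, on hypothetical random data (here `numpy.linalg.qr` plays the role of Gram-Schmidt in producing the orthonormal basis and the matrix $M$):

```python
import numpy as np

# Hypothetical data: m vectors in R^n, stacked as columns of V.
rng = np.random.default_rng(0)
n, m = 5, 3
V = rng.standard_normal((n, m))  # columns are v_1, ..., v_m

# QR factorization gives V = U @ M with U having orthonormal columns
# (an orthonormal basis of a subspace containing the v_i) and M square.
U, M = np.linalg.qr(V)

A = V.T @ V  # Gram matrix: a_ij = <v_i, v_j>

# The hint's identities: A = M^t M, hence det(A) = (det M)^2 >= 0.
assert np.allclose(A, M.T @ M)
assert np.isclose(np.linalg.det(A), np.linalg.det(M) ** 2)
assert np.linalg.det(A) >= 0
```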

Obs: The solution looks deceptively simple, resting on little more than the multiplicativity of the determinant. Behind it stands the Gram-Schmidt orthogonalization process, which is in fact intimately connected with these kinds of determinants (also called Gram determinants) and can be made explicit using Gram matrices.


Hints:

Show that $A$ is a positive-semidefinite matrix.

What can you say about the eigenvalues of a positive-semidefinite matrix?

What's the relation between the eigenvalues of a matrix and its determinant?

Can you show that $0$ is not an eigenvalue of $A$ if and only if $v_1, \dots, v_m$ are linearly independent? If $0$ is not an eigenvalue, is the matrix positive definite rather than merely positive semidefinite?
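These hints can be checked numerically on hypothetical data: the Gram matrix is symmetric positive semidefinite, its determinant is the product of its (nonnegative) eigenvalues, and forcing a linear dependence among the vectors forces a zero eigenvalue.

```python
import numpy as np

# Hypothetical data: 3 generic (hence independent) vectors in R^4.
rng = np.random.default_rng(1)
V = rng.standard_normal((4, 3))
A = V.T @ V  # Gram matrix

eig = np.linalg.eigvalsh(A)  # real eigenvalues of the symmetric matrix A
assert np.all(eig >= -1e-12)  # positive semidefinite: eigenvalues >= 0
assert np.isclose(np.prod(eig), np.linalg.det(A))  # det = product of eigenvalues

# Now make the set dependent: replace v_3 by v_1 + v_2.
W = V.copy()
W[:, 2] = W[:, 0] + W[:, 1]
B = W.T @ W
assert np.isclose(np.min(np.linalg.eigvalsh(B)), 0)  # 0 is an eigenvalue
assert np.isclose(np.linalg.det(B), 0)               # hence det(B) = 0
```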