Let $(v_1,v_2,...)$ be linearly independent vectors in an inner product space $(V, \langle\cdot,\cdot\rangle)$.
Let $(e_1, e_2, \ldots)$ be the Gram-Schmidt orthonormalization of $(v_1, v_2, \ldots)$, so $e_1=\frac{v_1}{\|v_1\|}$, etc.
Let $G_n$ be the $n \times n$ Gram matrix, so $(G_n)_{i,j}=\langle v_i,v_j\rangle$.
I want to show that $$\frac{\det(G_{n+1})}{\det(G_{n})}=\Bigl\|v_{n+1}-\sum_{i=1}^{n}\langle v_{n+1},e_i\rangle e_i\Bigr\|^2.$$
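Not a proof, of course, but the identity is easy to sanity-check numerically. Here is a minimal sketch in $\mathbb{R}^4$ with the standard dot product; `gram_det` and `gram_schmidt` are ad-hoc helpers written for this check, not library functions:

```python
import math
import random

def dot(u, v):
    """Standard inner product on R^d."""
    return sum(a * b for a, b in zip(u, v))

def det(M):
    """Determinant via Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    n, d = len(M), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        if M[p][i] == 0:
            return 0.0
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

def gram_det(vs):
    """det of the Gram matrix (G)_{ij} = <v_i, v_j>."""
    return det([[dot(u, w) for w in vs] for u in vs])

def gram_schmidt(vs):
    """Orthonormalize vs; assumes linear independence."""
    es = []
    for v in vs:
        w = list(v)
        for e in es:
            c = dot(v, e)
            w = [wi - c * ei for wi, ei in zip(w, e)]
        nw = math.sqrt(dot(w, w))
        es.append([wi / nw for wi in w])
    return es

random.seed(0)
n = 2
vs = [[random.gauss(0, 1) for _ in range(4)] for _ in range(n + 1)]  # v_1..v_{n+1}

# residual of v_{n+1} against e_1..e_n
es = gram_schmidt(vs[:n])
resid = list(vs[n])
for e in es:
    c = dot(vs[n], e)
    resid = [ri - c * ei for ri, ei in zip(resid, e)]

# det(G_{n+1}) / det(G_n) should equal ||residual||^2
ratio = gram_det(vs[: n + 1]) / gram_det(vs[:n])
print(math.isclose(ratio, dot(resid, resid)))  # prints True
```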
I'm trying to use induction on $n$. The $n=1$ case is pretty easy to prove. The inductive hypothesis is that for some $n \in \mathbb{N}$,
$\frac{\det(G_{n+1})}{\det(G_{n})}=\bigl\|v_{n+1}-\sum_{i=1}^{n}\langle v_{n+1},e_i\rangle e_i\bigr\|^2$, and from this I need to show that $$\frac{\det(G_{n+2})}{\det(G_{n+1})}=\Bigl\|v_{n+2}-\sum_{i=1}^{n+1}\langle v_{n+2},e_i\rangle e_i\Bigr\|^2.$$
This is where I get stuck. Expanding the determinant seems far too messy and I wasn't able to get anywhere with it; even the $n=2$ case was hard to manage. I did manage to show that for $j<i$ we have
$$\langle v_i, v_j\rangle = \langle v_i, e_j\rangle\Bigl\|v_j - \sum_{k=1}^{j-1}\langle v_j,e_k\rangle e_k\Bigr\| + \sum_{k=1}^{j-1}\langle v_i, e_k \rangle \langle e_k, v_j \rangle,$$ and also that $$\langle v_i, v_i\rangle = \Bigl\|v_i - \sum_{k=1}^{i-1}\langle v_i,e_k\rangle e_k\Bigr\|^2 + \sum_{k=1}^{i-1}\langle v_i, e_k \rangle \langle e_k, v_i \rangle.$$
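(For reference, expansions of this kind all come from writing each vector in the orthonormal basis it generates: $v_j \in \operatorname{span}(e_1,\ldots,e_j)$, with $j$-th coefficient $\langle v_j, e_j\rangle = \bigl\|v_j - \sum_{k=1}^{j-1}\langle v_j,e_k\rangle e_k\bigr\|$, so for any $i$ and any $j \le i$,
$$\langle v_i, v_j \rangle = \sum_{k=1}^{j} \langle v_i, e_k\rangle\langle e_k, v_j\rangle,$$
and both displayed identities are instances of this after splitting off the $k=j$ term.)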
Finally, after messing around a lot, I think that $$\det(G_n)=\prod_{i=1}^{n}\Bigl(\|v_i\|^2-\sum_{j=1}^{i-1}\langle v_i, e_j \rangle \langle e_j, v_i \rangle\Bigr).$$
If this is true, then the result I'm after becomes trivial, but I also haven't managed to get very far on proving this identity either.
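One observation: this product formula is actually equivalent to the ratio identity. By Parseval, the $i$-th factor equals $\|v'_i\|^2$, where $v'_i = v_i - \sum_{j=1}^{i-1}\langle v_i,e_j\rangle e_j$, so if the ratio identity holds for every $m < n$, the product telescopes:
$$\det(G_n) = \det(G_1)\prod_{m=1}^{n-1}\frac{\det(G_{m+1})}{\det(G_m)} = \|v_1\|^2\prod_{m=1}^{n-1}\bigl\|v'_{m+1}\bigr\|^2 = \prod_{i=1}^{n}\bigl\|v'_i\bigr\|^2.$$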
Any help on this would be much appreciated; it doesn't seem like it should be this complicated! I can provide more details if needed. Thanks!
EDIT: I found a similar question here: Special Gram's inequality. This makes me think my problem has something to do with the generalized distance, and suggests there might be a less brute-force method.
HINT: You can check by induction that for each $m$, the span of the vectors $(v_1, v_2, \ldots, v_m)$ equals the span of the vectors $(e_1, e_2, \ldots, e_m)$.
Since the vector $v'_{n+1} = v_{n+1}-\sum_{i=1}^{n}\langle v_{n+1},e_i\rangle e_i$ is perpendicular to all of the vectors $e_1, \ldots, e_n$, it is perpendicular to all of the vectors $v_1, \ldots, v_n$.
Consider the Gram determinant $G(v_1, v_2, \ldots, v_n, v'_{n+1})$. It equals the Gram determinant $G(v_1, \ldots, v_n, v_{n+1}) = \det(G_{n+1})$: we have an equation of the form $$v'_{n+1} = v_{n+1} + \sum_{m=1}^n a_m v_m,$$ so the last row (and column) of the new Gram matrix differs from the old one by a linear combination of the other rows (columns), which leaves the determinant unchanged.
But using the orthogonality of $v'_{n+1}$ to $v_1, \ldots, v_n$, the last row and column of this Gram matrix vanish except for the diagonal entry $\|v'_{n+1}\|^2$, so $$G(v_1, v_2, \ldots, v_n, v'_{n+1}) = \det(G_n) \cdot \|v'_{n+1}\|^2,$$ which is exactly the desired identity.
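For what it's worth, both steps of this argument are easy to confirm numerically. The sketch below (plain Python, standard dot product on $\mathbb{R}^5$, ad-hoc helper names) checks that replacing $v_{n+1}$ by $v'_{n+1}$ leaves the Gram determinant unchanged, and that it then factors as claimed:

```python
import math
import random

def dot(u, v):
    """Standard inner product on R^d."""
    return sum(a * b for a, b in zip(u, v))

def det(M):
    """Determinant via Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    n, d = len(M), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        if M[p][i] == 0:
            return 0.0
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

def gram_det(vs):
    """det of the Gram matrix (G)_{ij} = <v_i, v_j>."""
    return det([[dot(u, w) for w in vs] for u in vs])

random.seed(1)
n = 3
vs = [[random.gauss(0, 1) for _ in range(5)] for _ in range(n + 1)]

# Orthonormalize v_1..v_n, then form v'_{n+1} = v_{n+1} - sum <v_{n+1}, e_i> e_i
es = []
for v in vs[:n]:
    w = list(v)
    for e in es:
        c = dot(v, e)
        w = [wi - c * ei for wi, ei in zip(w, e)]
    nw = math.sqrt(dot(w, w))
    es.append([wi / nw for wi in w])
resid = list(vs[n])
for e in es:
    c = dot(vs[n], e)
    resid = [ri - c * ei for ri, ei in zip(resid, e)]

g_full = gram_det(vs)                   # det G_{n+1}
g_swapped = gram_det(vs[:n] + [resid])  # Gram det with v_{n+1} replaced by v'_{n+1}
g_factored = gram_det(vs[:n]) * dot(resid, resid)
print(math.isclose(g_full, g_swapped), math.isclose(g_full, g_factored))  # True True
```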