Given a sequence of vectors $(a_1, a_2, \ldots)$ with $a_i \in \mathbb{R}^n$ for all $i \in \{ 1, 2, \ldots \}$, is the following equation true, and furthermore is it possible to formally prove that it is true?
$$\sum_{\sigma\in S_{j-1}}{\left[\mathrm{sgn}(\sigma)\prod_{l=1}^{j-1}{\left[ a_l^T a_{\sigma(l)} \right ]} \left( a_j^Ta_j - \sum_{k=1}^{j-1}{\left[ \frac{\left( a_j^T a_k \right ) \left(a_{\sigma(k)}^T a_j \right )}{a_k^T a_{\sigma(k)} } \right ]} \right ) \right ]} = \sum_{\sigma\in S_{j}}{\left[\mathrm{sgn}(\sigma)\prod_{l=1}^{j}{\left[ a_l^T a_{\sigma(l)} \right ]} \right ]}$$
For example, with $j=3$:
$$ \begin{aligned} &\sum_{\sigma\in S_{2}}{\left[\mathrm{sgn}(\sigma)\prod_{l=1}^{2}{\left[ a_l^T a_{\sigma(l)} \right ]} \left( a_3^T a_3 - \sum_{k=1}^{2}{\left[ \frac{\left( a_3^T a_k \right ) \left(a_{\sigma(k)}^T a_3 \right )}{a_k^T a_{\sigma(k)} } \right ]} \right ) \right ]} \\\\ &=+(a_1^T a_1)(a_2^T a_2) \left( a_3^T a_3 - \frac{( a_3^T a_1 ) (a_1^T a_3 )}{a_1^T a_1 } - \frac{( a_3^T a_2 ) (a_2^T a_3 )}{a_2^T a_2 } \right ) - (a_1^T a_2)(a_2^T a_1) \left( a_3^T a_3 - \frac{( a_3^T a_1 ) (a_2^T a_3 )}{a_1^T a_2 } - \frac{( a_3^T a_2 ) (a_1^T a_3 )}{a_2^T a_1 } \right ) \\\\ &= + (a_1^T a_1) (a_2^T a_2) (a_3^T a_3) - (a_1^T a_3) (a_2^T a_2) (a_3^T a_1) - (a_1^T a_1) (a_2^T a_3) (a_3^T a_2) - (a_1^T a_2) (a_2^T a_1) (a_3^T a_3) + (a_1^T a_3) (a_2^T a_1) (a_3^T a_2) + (a_1^T a_2) (a_2^T a_3) (a_3^T a_1) \\\\ &=\sum_{\sigma\in S_{3}}{\left[\mathrm{sgn}(\sigma)\prod_{l=1}^{3}{\left[ a_l^T a_{\sigma(l)} \right ]} \right ]} \end{aligned} $$
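Before attempting a proof, the identity can also be checked for small $j$ by direct computation. Here is a quick sketch (my own code, not part of the question) that evaluates both sides exactly using `fractions.Fraction`, with 0-based indices so that `a[j-1]` plays the role of $a_j$; the sample vectors are arbitrary, chosen with all-positive entries so that no denominator $a_k^T a_{\sigma(k)}$ vanishes:

```python
import itertools
import math
from fractions import Fraction

def dot(u, v):
    # Exact rational inner product, so the divisions on the LHS cancel exactly.
    return sum(Fraction(x) * y for x, y in zip(u, v))

def perm_sign(p):
    # Sign of a permutation of {0, ..., len(p)-1} via inversion count.
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

def rhs(a, j):
    # Sum over S_j of sgn(sigma) * prod_{l=1}^{j} a_l^T a_{sigma(l)}.
    return sum(
        perm_sign(p) * math.prod(dot(a[l], a[p[l]]) for l in range(j))
        for p in itertools.permutations(range(j))
    )

def lhs(a, j):
    # Sum over S_{j-1}, with the bracketed correction factor for a_j.
    total = Fraction(0)
    for p in itertools.permutations(range(j - 1)):
        prod_ip = math.prod(
            (dot(a[l], a[p[l]]) for l in range(j - 1)), start=Fraction(1)
        )
        corr = dot(a[j - 1], a[j - 1])
        for k in range(j - 1):
            corr -= dot(a[j - 1], a[k]) * dot(a[p[k]], a[j - 1]) / dot(a[k], a[p[k]])
        total += perm_sign(p) * prod_ip * corr
    return total

# Arbitrary sample vectors in R^4 with all-positive entries.
a = [(1, 2, 1, 1), (2, 1, 3, 1), (1, 1, 1, 2), (3, 1, 2, 2)]
```

With these vectors, `lhs(a, j)` and `rhs(a, j)` agree exactly (as `Fraction`s) for $j = 2, 3, 4$.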
It seems intuitively plausible to me that this equation is true (indeed, it's clear that both sides expand to the same number of terms, and the denominators on the LHS all cancel out), although I'm not sure how to prove it.
EDIT: I'm looking for answers that don't involve matrix determinants. For an explanation of this, and the general motivation for this question, see the comments below on this question.
EDIT 2: Note to self: a useful approach might be to start with the RHS and (i) write the sum $\sum_{\sigma\in S_j}$ as two sums $\sum_{\sigma\in S_j: \sigma(j) = j}$ and $\sum_{\sigma\in S_j: \sigma(j) \ne j}$; (ii) factorise the terms involving $a_j$ out of the product of inner products; (iii) write $\sum_{\sigma\in S_j: \sigma(j) \ne j}$ as $\sum_{k=1}^{j-1}\sum_{\sigma\in S_j: \sigma(j) = k}$; (iv) simplify and then rewrite in terms of $S_{j-1}$, accounting for the change in sign of the permutations which don't fix $j$.
Or, slightly simpler, write $\sum_{\sigma\in S_j}$ as $\sum_{k=1}^{j}\sum_{\sigma\in S_j: \sigma(j) = k}$ - later the $\sum_{k=1}^{j}$ can be split up and an $a_j^T a_j$ term will hopefully cancel in the numerator and denominator.
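For what it's worth, the first step of this sketch can be written out explicitly; the following is only the split over the value $\sigma(j) = k$ (the easy part), with the remaining bookkeeping for $k < j$ indicated in the comments but not carried out in full:

```latex
% Split the RHS according to the value sigma(j) = k; within the inner sum
% a_j^T a_{sigma(j)} = a_j^T a_k is constant, so it can be pulled out:
\sum_{\sigma\in S_j} \mathrm{sgn}(\sigma) \prod_{l=1}^{j} a_l^T a_{\sigma(l)}
  = \sum_{k=1}^{j} \, (a_j^T a_k)
    \sum_{\substack{\sigma\in S_j \\ \sigma(j)=k}}
    \mathrm{sgn}(\sigma) \prod_{l=1}^{j-1} a_l^T a_{\sigma(l)} .
% For k = j, each such sigma restricts to tau in S_{j-1} with
% sgn(tau) = sgn(sigma), giving the (a_j^T a_j) term of the LHS:
(a_j^T a_j) \sum_{\tau\in S_{j-1}} \mathrm{sgn}(\tau)
    \prod_{l=1}^{j-1} a_l^T a_{\tau(l)} .
% For k < j, tau = (k\,j) \circ \sigma fixes j and sgn(tau) = -sgn(sigma);
% at the single index m = tau^{-1}(k) the sigma-product has a_m^T a_j where
% the tau-product has a_m^T a_k, which produces the quotient in the
% correction term (use a_i^T a_j = a_j^T a_i to trade tau^{-1}(k) for tau(k)).
```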
The RHS is the determinant of the Gram matrix $\det(a_i^T a_j)$. The LHS should be obtained from the RHS via a cofactor expansion along either the last row or the last column, although this is somewhat disguised by how you've written it.
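As a numerical sanity check on this observation (my own code, not part of the answer): the RHS is the Leibniz formula applied to the Gram matrix $G_{lm} = a_l^T a_m$, so it can be compared against `numpy.linalg.det` directly:

```python
import itertools
import numpy as np

def perm_sign(p):
    # Sign of a permutation of {0, ..., len(p)-1} via inversion count.
    s = 1
    for i in range(len(p)):
        for j in range(i + 1, len(p)):
            if p[i] > p[j]:
                s = -s
    return s

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 5))  # rows are a_1, a_2, a_3 in R^5
G = a @ a.T                      # Gram matrix, G[l, m] = a_l^T a_m

# RHS of the identity = Leibniz expansion of det(G).
perm_sum = sum(
    perm_sign(p) * np.prod([G[l, p[l]] for l in range(3)])
    for p in itertools.permutations(range(3))
)
```

Up to floating-point error, `perm_sum` agrees with `np.linalg.det(G)`.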