I am trying to show:
$\det(S) = \frac{S\vec{u}\cdot(S\vec{v} \times S\vec{w})}{\vec{u}\cdot(\vec{v} \times \vec{w})}$
for $\vec{u},\vec{v},\vec{w}$ linearly independent vectors in $\mathbb{R}^3$ (the cross product requires three dimensions) and $S$ such that its determinant is non-zero (I am not sure if I need this).
I know that if I denote by $P$ the matrix whose rows are the three vectors: $P\equiv \begin{pmatrix} \vec{u} \\ \vec{v} \\ \vec{w} \end{pmatrix}$
Then:
$\det(P)=\vec{u}\cdot(\vec{v}\times\vec{w})$
Similarly for:
$P_S\equiv \begin{pmatrix} S\vec{u} \\ S\vec{v} \\ S\vec{w} \end{pmatrix}$
I have:
$\det(P_S)=S\vec{u}\cdot(S\vec{v}\times S\vec{w})$
and since
$SP^T = S\,(\vec{u},\vec{v},\vec{w}) = (S\vec{u},S\vec{v},S\vec{w}) = P_S^T$
I have
$\det(SP^T) = \det S \,\det P^T = \det S \,\det P = \det P_S^T = \det P_S$
I get:
$\det S = \frac{\det P_S}{\det P}$ (well-defined since $\det P = \vec{u}\cdot(\vec{v}\times\vec{w}) \neq 0$ by linear independence),
showing what I need. Is there a simpler or more elegant proof that I am missing?
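As a quick numerical sanity check of the identity (not part of the proof): the script below computes both sides directly with hand-rolled $3\times 3$ linear algebra. The specific matrix $S$ and vectors $\vec{u},\vec{v},\vec{w}$ are arbitrary example choices.

```python
# Sanity check of det(S) = (Su . (Sv x Sw)) / (u . (v x w)) in R^3,
# using hand-rolled 3x3 operations (no external libraries).

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def triple(a, b, c):
    # scalar triple product a . (b x c)
    return dot(a, cross(b, c))

def matvec(S, x):
    return tuple(dot(row, x) for row in S)

def det3(S):
    # det(S) = row1 . (row2 x row3), the same triple product
    return triple(S[0], S[1], S[2])

# Arbitrary example matrix and linearly independent vectors
S = ((2.0, 1.0, 0.0),
     (0.0, 3.0, 1.0),
     (1.0, 0.0, 1.0))
u, v, w = (1.0, 0.0, 2.0), (0.0, 1.0, 1.0), (1.0, 1.0, 0.0)

ratio = triple(matvec(S, u), matvec(S, v), matvec(S, w)) / triple(u, v, w)
print(abs(ratio - det3(S)) < 1e-9)  # True
```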
Let $V$ be a $K$-vector space with $\dim V = N$. It can be proven that any map $\phi : V^N \to K$ that is multilinear in every argument and completely antisymmetric is proportional to the determinant of the matrix constructed with the arguments of $\phi$ as columns. That means that if you show that \begin{equation} \phi(S^1,S^2,S^3) = \frac{S\vec{u}\cdot(S\vec{v} \times S\vec{w})}{\vec{u}\cdot(\vec{v} \times \vec{w})}, \end{equation} viewed as a function of the columns $S^1,S^2,S^3$ of $S$, is multilinear and completely antisymmetric, you have $\phi(S^1,S^2,S^3) = \lambda \det(S)$. To determine $\lambda$, take $S$ equal to the identity matrix: the fraction simplifies, giving $\lambda=1$. I want to stress the fact that this result is valid only for $\dim V = 3$, since the cross product is only defined there.
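To illustrate this characterization numerically (a sketch, not a proof): the snippet below treats $\phi$ as a function of the three columns of $S$ and checks linearity in the first column, the sign flip under a column swap, and $\phi(\mathrm{id}) = 1$. The fixed $\vec{u},\vec{v},\vec{w}$ and the test columns are arbitrary choices.

```python
# Check the three properties the answer relies on:
# phi is linear in each column, antisymmetric under column swaps,
# and equals 1 on the identity matrix -- hence phi = det.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def triple(a, b, c):
    return sum(x*y for x, y in zip(a, cross(b, c)))

# Fixed linearly independent u, v, w (arbitrary example choice)
u, v, w = (1.0, 0.0, 2.0), (0.0, 1.0, 1.0), (1.0, 1.0, 0.0)

def phi(c1, c2, c3):
    # Build S from its columns c1, c2, c3 and apply it to u, v, w
    S = tuple(zip(c1, c2, c3))  # rows of S
    mv = lambda x: tuple(sum(r[i]*x[i] for i in range(3)) for r in S)
    return triple(mv(u), mv(v), mv(w)) / triple(u, v, w)

c1, c2, c3 = (2.0, 0.0, 1.0), (1.0, 3.0, 0.0), (0.0, 1.0, 1.0)
X, Y, a, b = (1.0, 2.0, 0.0), (0.0, 1.0, 3.0), 2.0, -1.0

# Linearity in the first column
lin = abs(phi(tuple(a*x + b*y for x, y in zip(X, Y)), c2, c3)
          - (a*phi(X, c2, c3) + b*phi(Y, c2, c3))) < 1e-9
# Sign flip when two columns are swapped
alt = abs(phi(c2, c1, c3) + phi(c1, c2, c3)) < 1e-9
# Normalization lambda = 1 on the identity
unit = abs(phi((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)) - 1.0) < 1e-9
print(lin and alt and unit)  # True
```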