Prove that $$\begin{vmatrix} 1 & \cos{x_{0}} & \sin{x_{0}} & \dots & \cos{nx_{0}} & \sin{nx_{0}} \\ 1 & \cos{x_{1}} & \sin{x_{1}} & \dots & \cos{nx_{1}} & \sin{nx_{1}}\\ \dots & \dots & \dots & \dots & \dots & \dots \\ 1 & \cos{x_{2n}} & \sin{x_{2n}} & \dots & \cos{nx_{2n}} & \sin{nx_{2n}} \notag \end{vmatrix} = 2^{n^2} \prod_{0 \leq p < q \leq 2n}\sin{\frac{x_{q}-x_{p}}{2}}.$$
This is simply stated (unproven and unexplained) in my numerical analysis textbook, in the trigonometric interpolation subsection: it is the determinant of the system of linear equations one gets when approximating $f$ with $$Q_{n}(x) = a_{0} + \sum_{k=1}^{n}(a_{k}\cos{kx} + b_{k}\sin{kx})$$ under the interpolation conditions $Q_{n}(x_{i}) = f(x_{i})$, $i \in \{0, 1, \dots, 2n\}$. I'm trying to find a proof by induction, but I suspect other approaches are possible, and any solution is welcome. Also, I'm not 100% sure about the coefficient $2^{n^2}$: expanding the determinant for $n=1$, I get $4$ instead of $2$ in front of the product of sines.
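A quick numerical check of the $n=1$ case (a pure-Python sketch; the node values $x_k$ are arbitrary choices of mine) supports the factor $4$:

```python
import math
from itertools import combinations

def det3(m):
    # cofactor expansion along the first row of a 3x3 matrix
    return (m[0][0]*(m[1][1]*m[2][2] - m[1][2]*m[2][1])
          - m[0][1]*(m[1][0]*m[2][2] - m[1][2]*m[2][0])
          + m[0][2]*(m[1][0]*m[2][1] - m[1][1]*m[2][0]))

xs = [0.3, 1.1, 2.5]                        # arbitrary sample nodes, n = 1
A = [[1.0, math.cos(x), math.sin(x)] for x in xs]

prod = 1.0
for p, q in combinations(range(3), 2):
    prod *= math.sin((xs[q] - xs[p]) / 2)

ratio = det3(A) / prod
print(ratio)   # ≈ 4.0, not 2
```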
First of all, your formula should be:
$$2^{\color{red}{2}n^2} \prod_{0 \leq p < q \leq 2n}\sin{\frac{x_{q}-x_{p}}{2}}.$$
Proof. Let us call:
$$\tag{1}A:=\begin{pmatrix} 1 & \cos{x_{0}} & \sin{x_{0}} & \dots & \cos{nx_{0}} & \sin{nx_{0}} \\ 1 & \cos{x_{1}} & \sin{x_{1}} & \dots & \cos{nx_{1}} & \sin{nx_{1}}\\ \dots & \dots & \dots & \dots & \dots & \dots \\ 1 & \cos{x_{2n}} & \sin{x_{2n}} & \dots & \cos{nx_{2n}} & \sin{nx_{2n}} \notag \end{pmatrix}.$$
Post-multiplication by the block-diagonal matrix
$$\tag{2}B=\operatorname{diag}\left(1, \begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix}, \begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix},\cdots, \begin{pmatrix} 1 & 1 \\ -i & i \end{pmatrix}\right)$$
(where $i$ is the imaginary unit; each $2\times2$ block has determinant $2i$, hence $\det(B)=(2i)^n$) gives the matrix:
$$\tag{3}AB=\begin{pmatrix} 1 & e^{-ix_{0}} & e^{ix_{0}} & \dots & e^{-inx_{0}} & e^{inx_{0}} \\ 1 & e^{-ix_{1}} & e^{ix_{1}} & \dots & e^{-inx_{1}} & e^{inx_{1}} \\ \dots & \dots & \dots & \dots & \dots & \dots\\ 1 & e^{-ix_{2n}} & e^{ix_{2n}} & \dots & e^{-inx_{2n}} & e^{inx_{2n}} \end{pmatrix} .$$
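The effect of one $2\times2$ block of $B$ on a $(\cos,\sin)$ pair of columns can be checked directly; a small Python sketch (the angle $x$ is an arbitrary sample value):

```python
import cmath, math

# One 2x2 block of B from (2); 1j is Python's imaginary unit
M = [[1, 1],
     [-1j, 1j]]

x = 0.7                          # arbitrary sample angle
c, s = math.cos(x), math.sin(x)

# row (cos x, sin x) times the block: this is what happens inside A*B
left  = c*M[0][0] + s*M[1][0]    # cos x - i sin x = e^{-ix}
right = c*M[0][1] + s*M[1][1]    # cos x + i sin x = e^{ix}

assert abs(left  - cmath.exp(-1j*x)) < 1e-12
assert abs(right - cmath.exp( 1j*x)) < 1e-12
```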
Now we pre-multiply $AB$ by the diagonal matrix
$$\tag{4}\Delta=\operatorname{diag}(e^{inx_{0}},e^{inx_{1}},\cdots, e^{inx_{2n}})$$
(with $\det(\Delta)=e^{ins}$, where $s$ is the sum of the $x_k$). This yields a matrix whose columns are the powers $e^{ijx_k}$, $0 \le j \le 2n$, in scrambled order; once the columns are put back in the right order, it is a Vandermonde matrix, which has a neat product expression (https://proofwiki.org/wiki/Vandermonde_Determinant). The reordering is done by post-multiplication by a certain permutation matrix $P$ which "re-winds" the columns:
$$\tag{5}P=\begin{pmatrix} 0 & 0 & \dots & 1 & 0 & \dots & 0 \\ 0 & 0 & \dots & 0 & 1 & \dots & 0 \\ \dots & \dots & \dots & \dots & \dots & \dots & \dots \\ 0 & 0 & \dots & 0 & \dots & 0 & 1 \\ 1 & 0 & \dots & 0 & \dots & 0 & 0 \notag \end{pmatrix}.$$
(the important thing is that the determinant of such a matrix is $\pm1$).
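For $n=1$ the column "re-wind" is easy to see explicitly; a small numeric sketch (pure Python, with arbitrarily chosen nodes) checking that $\Delta AB$, after the column swap, is the Vandermonde matrix in $z_k=e^{ix_k}$:

```python
import cmath

xs = [0.3, 1.1, 2.5]                 # arbitrary nodes, n = 1 case
z  = [cmath.exp(1j*x) for x in xs]

# For n = 1, a row of Delta*A*B is e^{ix}*(1, e^{-ix}, e^{ix}) = (z, 1, z^2),
# so P just has to swap the first two columns.
DAB = [[zk, 1, zk**2] for zk in z]
V   = [[row[1], row[0], row[2]] for row in DAB]

for k, zk in enumerate(z):
    for j in range(3):
        assert abs(V[k][j] - zk**j) < 1e-12   # Vandermonde entries z_k^j
```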
To conclude, it suffices to take the determinant of the matrix product
$$\tag{6}V=\Delta ABP$$
($V$ as in Vandermonde) and apply the nice Vandermonde determinant formula (the product of all differences, always taken in a pre-assigned order), together with Remark 1 below:
$$\tag{7}e^{i\alpha}\,2^{2n^2+n} \prod_{0 \leq p < q \leq 2n}\sin{\frac{x_{q}-x_{p}}{2}}=\underbrace{e^{ins}}_{\det(\Delta)} \times \det(A) \times \underbrace{2^n i^n}_{\det(B)} \times \underbrace{\pm 1}_{\det(P)}$$
(for a certain real $\alpha$), from which, dividing by $2^n$, we obtain $\det(A)=e^{i\beta}\,2^{2n^2}\prod_{0 \leq p < q \leq 2n}\sin{\frac{x_{q}-x_{p}}{2}}$ for some $\beta$ (see remarks below).
Remark 1:
$$\tag{8}e^{ix_p}-e^{ix_q}=\underbrace{(e^{i \tfrac{x_p-x_q}{2}}-e^{-i \tfrac{x_p-x_q}{2}})}_{2i \sin \tfrac{x_p-x_q}{2}}\,e^{i \tfrac{x_p+x_q}{2}}.$$
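This identity is easy to confirm numerically; a minimal sketch (the angles $x_p, x_q$ are arbitrary sample values):

```python
import cmath, math

xp, xq = 0.9, 2.3   # arbitrary sample angles

lhs = cmath.exp(1j*xp) - cmath.exp(1j*xq)
rhs = 2j * math.sin((xp - xq)/2) * cmath.exp(1j*(xp + xq)/2)

assert abs(lhs - rhs) < 1e-12
```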
Remark 2: We have obtained the desired formula up to multiplication by a complex number $e^{i \beta}$; but, as the result must be a real number, the desired formula holds up to a sign $\pm1$.
Remark 3: Applying the identity of Remark 1 to every factor of the Vandermonde product generates the coefficient $(2i)^{n(2n+1)}$, because the $(2n+1)$-element set $\{x_0,x_1,\cdots, x_{2n}\}$ has $\tfrac{(2n+1)2n}{2}=n(2n+1)$ unordered pairs $(x_p,x_q)$.
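The pair count itself can be sanity-checked with a one-liner over the standard combinations:

```python
from itertools import combinations

# Number of unordered pairs (x_p, x_q) drawn from {x_0, ..., x_{2n}}:
# C(2n+1, 2) = (2n+1)(2n)/2 = n(2n+1)
counts = {}
for n in range(1, 6):
    counts[n] = len(list(combinations(range(2*n + 1), 2)))
    assert counts[n] == n*(2*n + 1)
```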