Cayley–Menger determinant.


Consider an $n$-simplex. I read that the Cayley–Menger determinant gives the squared volume of the simplex, but I don't know how to derive this.

Actually, after some attempts I got this: $\displaystyle V_s^2 = \frac 1 {(n!)^2}G(v_1,\ldots, v_n)$, where $G(v_1,\dots, v_n)$ is the Gram determinant of the edge vectors. But how can we transform the Gramian into the Cayley–Menger matrix? Any ideas?
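As a quick sanity check of the Gram-determinant formula, here is a sketch in exact rational arithmetic (the `det` helper and the sample triangle are illustrative choices, not from any particular library):

```python
from fractions import Fraction
from math import factorial

def det(M):
    # Exact determinant via Gaussian elimination over Fractions.
    M = [[Fraction(x) for x in row] for row in M]
    n, d = len(M), Fraction(1)
    for i in range(n):
        p = next((r for r in range(i, n) if M[r][i]), None)
        if p is None:
            return Fraction(0)          # singular matrix
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d                      # a row swap flips the sign
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

# 3-4-5 right triangle: x0=(0,0), x1=(3,0), x2=(0,4); area should be 6.
x0, x1, x2 = (0, 0), (3, 0), (0, 4)
v1 = [a - b for a, b in zip(x1, x0)]
v2 = [a - b for a, b in zip(x2, x0)]
# Gram matrix G(v1, v2) = Y^T Y with Y = [v1 | v2]
G = [[sum(a * b for a, b in zip(u, w)) for w in (v1, v2)] for u in (v1, v2)]
V2 = det(G) / factorial(2) ** 2
print(V2)  # 36, i.e. area = 6
```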

Best Answer

What you mentioned in your question is a bit inaccurate. Strictly speaking, suppose

  • $x_0,x_1,\ldots,x_m$ are $m+1$ points in $\mathbb R^n$, where $m\le n$;
  • $X\in M_{n,m+1}(\mathbb R)$ is the matrix $[x_0|x_1|\cdots|x_m]$ whose columns are the points;
  • $Y\in M_{n,m}(\mathbb R)$ is the matrix $[x_1-x_0|\cdots|x_m-x_0]$ of edge vectors emanating from $x_0$;
  • $C\in M_{m+2}(\mathbb R)$ is the Cayley-Menger matrix $\pmatrix{0&e^T\\ e&B}$, where $B\in M_{m+1}(\mathbb R)$ is the squared distance matrix defined by $b_{ij}=\|x_i-x_j\|^2$ (in zero-based indexing) and $e\in\mathbb R^{m+1}$ is a vector of ones;
  • $V$ is the volume of the $m$-simplex with vertices $x_0,x_1,\ldots,x_m$, measured within the $m$-dimensional affine subspace they span.

Then
$$ V^2 = \frac1{(m!)^2}\det(Y^TY) = \frac{-1}{(-2)^m (m!)^2} \det(C).\tag{1} $$
In other words, the Cayley–Menger determinant is related to the Gram determinant by
$$ \det(C) = -(-2)^m \det(Y^TY).\tag{2} $$
Note that the Gramian here is not the Gramian of the data matrix $X$, but the Gramian of the reduced data matrix $Y$. Conceptually, if you want to calculate the volume using a Gramian, you should adopt a coordinate system in which one vertex of the simplex is located at the origin.
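Both sides of $(1)$ can be checked numerically. Below is a sketch in plain Python with exact rationals, using a unit tetrahedron as the test case; the hand-rolled `det` helper and the choice of points are mine, not part of the answer:

```python
from fractions import Fraction
from math import factorial

def det(M):
    # Exact determinant via Gaussian elimination over Fractions.
    M = [[Fraction(x) for x in row] for row in M]
    n, d = len(M), Fraction(1)
    for i in range(n):
        p = next((r for r in range(i, n) if M[r][i]), None)
        if p is None:
            return Fraction(0)
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

# Unit tetrahedron in R^3 (m = 3): volume 1/6, so V^2 = 1/36.
xs = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
m = len(xs) - 1

# Gram side of (1): Y = [x1-x0 | ... | xm-x0], G = Y^T Y.
Y = [[xs[j + 1][k] - xs[0][k] for k in range(3)] for j in range(m)]
G = [[sum(a * b for a, b in zip(u, w)) for w in Y] for u in Y]
V2_gram = det(G) / factorial(m) ** 2

# Cayley-Menger side of (1): C = [[0, e^T], [e, B]], b_ij = ||x_i - x_j||^2.
B = [[sum((a - b) ** 2 for a, b in zip(p, q)) for q in xs] for p in xs]
C = [[0] + [1] * (m + 1)] + [[1] + row for row in B]
V2_cm = -det(C) / ((-2) ** m * factorial(m) ** 2)

print(V2_gram, V2_cm, det(C))  # 1/36 1/36 8
```

Note that $\det(C)=8=-(-2)^3\cdot 1$ here, matching $(2)$ with $\det(Y^TY)=1$.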

We may derive $(2)$ directly. Let $d=(\|x_0\|^2,\ldots,\|x_m\|^2)^T\in\mathbb R^{m+1}$ be the diagonal of $X^TX$. Since $b_{ij}=\|x_i\|^2-2\langle x_i,x_j\rangle+\|x_j\|^2$, we have
$$ C=\pmatrix{0&e^T\\ e&de^T+ed^T-2X^TX}.\tag{3} $$
Subtracting $d$ times the first row from the other rows, and $d$ times the first column from the other columns, we get
$$ \det(C)=\det\pmatrix{0&e^T\\ e&-2X^TX}.\tag{4} $$
Now, if we subtract the second column from the third, the fourth, up to the last column, and do the same for the rows, we obtain
$$ \det(C)=\det\underbrace{\pmatrix{0&e_1^T\\ e_1&-2[x_0|Y]^T[x_0|Y]}}_{\widetilde{C}},\tag{5} $$
where $e_1=(1,0,\ldots,0)^T$. Since the last $m$ entries of the first row of $\widetilde{C}$ are zero, if $Y$ has linearly dependent columns, then the last $m$ columns of $\widetilde{C}$ are also linearly dependent; both sides of $(2)$ then vanish and $(2)$ holds. If $Y$ has full column rank instead, then multiplying $\widetilde{C}$ by
$$ P=\pmatrix{1&0&0\\ 0&1&0\\ 0&-(Y^TY)^{-1}Y^Tx_0&I_m} $$
on the right, and by $P^T$ on the left, we obtain $(2)$:
$$ \det(C)=\det(\widetilde{C})=\det(P^T\widetilde{C}P)=\det\pmatrix{0&1&0\\ 1&\ast&0\\ 0&0&-2Y^TY}=-(-2)^m\det(Y^TY). $$
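The elementary operations above only add multiples of one row or column to another, so each step preserves $\det(C)$. The step from $(3)$ to $(4)$ can be spot-checked numerically; the sketch below uses arbitrary sample points away from the origin (so the step is nontrivial) and a hand-rolled exact `det`:

```python
from fractions import Fraction

def det(M):
    # Exact determinant via Gaussian elimination over Fractions.
    M = [[Fraction(x) for x in row] for row in M]
    n, d = len(M), Fraction(1)
    for i in range(n):
        p = next((r for r in range(i, n) if M[r][i]), None)
        if p is None:
            return Fraction(0)
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

# Arbitrary sample points NOT at the origin, so the (3) -> (4) step is nontrivial.
xs = [(1, 2, 0), (2, 0, 1), (0, 1, 1), (1, 1, 2)]
m = len(xs) - 1

# C as in (3): bordered squared-distance matrix.
B = [[sum((a - b) ** 2 for a, b in zip(p, q)) for q in xs] for p in xs]
C = [[0] + [1] * (m + 1)] + [[1] + row for row in B]

# The matrix inside (4): same border, but -2 X^T X in the lower-right block.
XtX = [[sum(a * b for a, b in zip(p, q)) for q in xs] for p in xs]
C4 = [[0] + [1] * (m + 1)] + [[1] + [-2 * v for v in row] for row in XtX]

# Gram matrix of Y = [x1-x0 | x2-x0 | x3-x0], for the right-hand side of (2).
Y = [[xs[j + 1][k] - xs[0][k] for k in range(3)] for j in range(m)]
G = [[sum(a * b for a, b in zip(u, w)) for w in Y] for u in Y]

print(det(C), det(C4), -(-2) ** m * det(G))  # all three agree: 128
```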