Almost negative definite matrices and norm-distance matrices


An "almost negative definite" matrix $A$ satisfies $$ v^te = 0\implies v^tAv\le 0 $$ for all vectors $v$, where $e=(1,1,\dots,1)$.

We know that if $A$ is a symmetric zero-diagonal (hollow) matrix, then $A$ is almost negative definite if and only if it is a Euclidean distance matrix (EDM).
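A quick numerical sanity check of this characterization (a sketch using NumPy; projecting onto the orthogonal complement of $e$ is just one convenient way to test the condition):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 3))          # 6 points in R^3
G = X @ X.T
# EDM of squared Euclidean distances: symmetric with zero diagonal
A = np.diag(G)[:, None] + np.diag(G)[None, :] - 2 * G

# "Almost negative definite": v'e = 0 implies v'Av <= 0, which is
# equivalent to P A P being negative semidefinite for P = projector
# onto the orthogonal complement of e.
n = A.shape[0]
P = np.eye(n) - np.ones((n, n)) / n
eigs = np.linalg.eigvalsh(P @ A @ P)
print(eigs.max() <= 1e-9)                # True: A is almost negative definite
```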

In my experiments, I couldn't find a counterexample to the statement

if $A$ is the distance matrix of $n$ points in $\mathbb R^s$, with the distance induced by a norm, $d(x,y)=\|x-y\|$, then $A$ is always almost negative definite,

but I still don't know whether it is true. It would imply that $A$ is an EDM and that $\sqrt{d(x,y)}$ satisfies the triangle inequality, but I still can't see why it should be the Euclidean distance between $n$ points.
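For what it's worth, here is one way such an experiment can be run numerically, e.g. for the $\ell_1$ norm (an illustrative script, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((8, 4))                      # 8 points in R^4
# Pairwise l1 (taxicab) distances: D[i, j] = ||x_i - x_j||_1
D = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=2)

# Check almost negative definiteness of D itself (not D squared):
n = D.shape[0]
P = np.eye(n) - np.ones((n, n)) / n                  # projector onto e-perp
max_eig = np.linalg.eigvalsh(P @ D @ P).max()
print(max_eig <= 1e-8)                               # no counterexample here
```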


Best answer

The statement is true. More generally, for every $\alpha\in (0,1)$ the metric space $(\mathbb{R}^n, \|x-y\|^\alpha)$ can be isometrically embedded into a Hilbert space (you only need $\alpha=1/2$ here). This is Corollary 4.8 in the book Embeddings and extensions in analysis by Wells and Williams, who attribute this result to John von Neumann.

The preparatory results take a couple of pages, so I only sketch the proof here. First, some definitions.

  1. A metric $d$ is of negative type if the matrix $(d(x_i,x_j)^2)$ is almost negative definite for any finite sequence of points $(x_i)$ in the metric space.
  2. A scalar function $f$ on a vector space is positive definite if the matrix $(f(x_i-x_j))$ is positive semidefinite for any finite sequence of points $(x_i)$ in the vector space.

Bochner's theorem characterizes positive definite functions on $\mathbb{R}^n$ as Fourier transforms of positive measures. In particular, the Gaussian $f(x)=\exp(-\lambda^2 \|x\|^2)$ is positive definite on a Hilbert space, being the Fourier transform of another Gaussian.
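As an illustration (not part of the proof), the positive definiteness of the Gaussian can be observed on a finite point set in Euclidean space:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((7, 3))          # 7 points in R^3
lam = 0.7
# Gram matrix f(x_i - x_j) for the Gaussian f(x) = exp(-lam^2 ||x||^2)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2)
K = np.exp(-lam**2 * sq)
min_eig = np.linalg.eigvalsh(K).min()
print(min_eig >= -1e-9)                  # True: positive semidefinite
```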

A connection between negative type and positive definiteness is as follows: $d$ is of negative type iff for every $\lambda>0$ the matrix $(\exp(-\lambda\, d(x_i,x_j)^2))$ is positive semidefinite for any finite sequence of points $(x_i)$ in the metric space. Both properties are also equivalent to the metric space being isometrically embeddable into a Hilbert space.
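To illustrate the equivalence with a concrete metric of negative type: taking $d(x,y)=\|x-y\|_2^{1/2}$ (the $\alpha=1/2$ case of the corollary), the criterion predicts that the matrix $(\exp(-\lambda \|x_i-x_j\|_2))$ is positive semidefinite, which a quick check confirms:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((7, 3))
lam = 0.5
# With d(x, y) = ||x - y||^(1/2) of negative type, exp(-lam * d^2)
# = exp(-lam * ||x_i - x_j||) should be positive semidefinite:
dist = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=2))
K = np.exp(-lam * dist)
min_eig = np.linalg.eigvalsh(K).min()
print(min_eig >= -1e-9)                  # True
```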

Finally, the identity $$ |t|^{2\alpha} =c_\alpha \int_0^\infty (1-\exp(-\lambda^2 t^2))\lambda^{-1-2\alpha}\,d\lambda $$ is used to show that the matrix $(d(x_i,x_j)^{2\alpha})$ is almost negative definite, provided that $d$ is of negative type. A key point is that almost negative definiteness is preserved under addition, and hence under integration.
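As a sanity check of the identity for $\alpha=1/2$: integrating by parts gives $c_{1/2}=1/\sqrt{\pi}$ (my own computation, not taken from the book), and a crude quadrature then reproduces $|t|$:

```python
import math

def snowflake_identity(t, alpha=0.5, L=400.0, steps=400_000):
    # Crude Riemann-sum quadrature of
    #   c_a * int_0^inf (1 - exp(-l^2 t^2)) l^(-1-2a) dl,  with c_{1/2} = 1/sqrt(pi)
    h = L / steps
    total = 0.0
    for k in range(1, steps + 1):
        lam = k * h
        total += (1.0 - math.exp(-(lam * t) ** 2)) * lam ** (-1.0 - 2 * alpha)
    total *= h
    total += 1.0 / L          # tail: integrand ~ l^-2 for large l (alpha = 1/2)
    return total / math.sqrt(math.pi)

print(abs(snowflake_identity(2.0) - 2.0) < 1e-2)   # recovers |t|^(2*alpha) = |t|
```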