Best approximation of an arbitrary matrix by a distance matrix


A distance matrix $D$ is an $n\times n$ matrix obtained by computing the pairwise distances $D_{ij} = \text{dist}(x_i, x_j)$ between a collection of points $\{x_1,\ldots, x_n\}$ in some metric space. By the metric axioms, $D$ satisfies:

  1. $D_{ii} = 0$ for all $i$
  2. $D_{ij} > 0$ if $i\neq j$ (assuming the points $x_i$ are distinct)
  3. $D_{ij} \le D_{ik}+D_{kj}$ for all $i, j, k$ (triangle inequality)
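As a quick numerical illustration of these three properties (the point set, its dimension, and the use of the Euclidean metric are arbitrary choices here):

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.normal(size=(5, 3))          # 5 points in R^3

# pairwise Euclidean distance matrix D[i, j] = ||x_i - x_j||
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

# property 1: zero diagonal
assert np.allclose(np.diag(D), 0)
# property 2: strictly positive off-diagonal (points are distinct a.s.)
assert (D[~np.eye(5, dtype=bool)] > 0).all()
# property 3: D[i, j] <= D[i, k] + D[k, j] for all triples (i, j, k),
# checked via broadcasting: axes are ordered (i, j, k)
assert (D[:, :, None] <= D[:, None, :] + D.T[None, :, :] + 1e-9).all()
```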

Let $\mathfrak D$ be the set of all $n\times n$ distance matrices. Given an arbitrary matrix $\widetilde{D} $, I want to find

$$ \min_{D \in \mathfrak D} \|\widetilde{D} -D\|_\text{F} $$

Are there any results similar to the Eckart–Young–Mirsky theorem for this problem? And what if we assume that $\widetilde D$ already satisfies (1) and (2)?

There are 2 best solutions below


Rephrasing a bit and relaxing the positivity constraint to non-negativity: given a matrix $\mathrm M \in \mathbb R^{n \times n}$, we obtain the following (convex) quadratic program (QP) in the symmetric matrix $\mathrm X \in \mbox{Sym}_n (\mathbb R)$:

$$\begin{array}{ll} \text{minimize} & \| \mathrm X - \mathrm M \|_{\text{F}}^2\\ \text{subject to} & \mathrm A \,\mbox{vec} (\mathrm X) \leq \mathrm b\\ & x_{ii} = 0, \quad 1 \leq i \leq n\\ & x_{ij} \geq 0, \quad i \neq j\end{array}$$

where $\mathrm A \,\mbox{vec} (\mathrm X) \leq \mathrm b$ encodes all the triangle inequalities $x_{ij} \leq x_{ik} + x_{kj}$. Note that the original positivity constraints have been relaxed to non-negativity constraints: with strict inequalities the feasible set would not be closed, so the minimum might not be attained.
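For small $n$, this QP can be solved directly with a general-purpose solver. A minimal sketch in Python, using `scipy.optimize.minimize` with SLSQP (a generic NLP method, not a specialized QP solver) — the function name, the parametrization by upper-triangular entries, and the brute-force enumeration of all ordered triples are my own choices, not part of the answer:

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def nearest_distance_matrix(M):
    """Project M (in Frobenius norm) onto the set of symmetric, hollow,
    non-negative matrices satisfying all triangle inequalities."""
    n = M.shape[0]
    # free variables: the upper-triangular entries x_ij, i < j
    pairs = list(itertools.combinations(range(n), 2))
    idx = {p: k for k, p in enumerate(pairs)}

    def entry(x, i, j):
        return 0.0 if i == j else x[idx[(min(i, j), max(i, j))]]

    def assemble(x):
        X = np.zeros((n, n))
        for (i, j), k in idx.items():
            X[i, j] = X[j, i] = x[k]
        return X

    def objective(x):
        return np.sum((assemble(x) - M) ** 2)

    # one linear constraint x_ik + x_kj - x_ij >= 0 per ordered triple;
    # this is the A vec(X) <= b block, written out constraint by constraint
    cons = [{'type': 'ineq',
             'fun': lambda x, i=i, j=j, k=k:
                 entry(x, i, k) + entry(x, k, j) - entry(x, i, j)}
            for i, j, k in itertools.permutations(range(n), 3)]

    # start from the symmetrized, clipped off-diagonal part of M
    x0 = np.array([(M[i, j] + M[j, i]) / 2 for i, j in pairs]).clip(min=0)
    res = minimize(objective, x0, method='SLSQP',
                   bounds=[(0, None)] * len(pairs), constraints=cons)
    return assemble(res.x)
```

Usage: perturb a genuine distance matrix and project it back; the result is again symmetric, hollow, non-negative, and triangle-feasible. Note the number of triangle constraints grows like $n^3$, so this brute-force formulation only scales to modest $n$.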

---

I found an intriguing reference:

*Convex Optimization & Euclidean Distance Geometry*, 2nd edition, by Jon Dattorro

In particular, Section 7.0.4 discusses this problem.