Minimize the error of distances by placing points


I have a set of 2d points whose positions I'm free to choose; let's call them x[1], x[2], ..., x[n].

I also have a set of n(n-1)/2 known distances, let's call them d[1,2], d[1,3], ..., d[n-2,n-1], d[n-2,n], d[n-1,n].

My goal is to assign values to all x such that the sum of squared differences between ||x[i]-x[j]|| and d[i,j] is minimized. (This implies that for each i,j, I want the distance between x[i] and x[j] to closely approximate d[i,j].) If it's simpler I could force the squared distance between x[i] and x[j] to closely approximate d[i,j]^2.

Is there a name for this problem? I feel convinced it should be possible to formulate it more "simply" than writing out the sum (||x[1]-x[2]||^2 - d[1,2]^2)^2 + (||x[1]-x[3]||^2 - d[1,3]^2)^2 + ... as an objective function and minimizing it using any generic optimization method. But thus far it's eluded me.
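For concreteness, the "generic optimization" route described above can be sketched directly. This is a minimal illustration, not a recommendation: the target distances `d` are generated from a hypothetical ground-truth layout purely so the example is self-contained, and the unconstrained stress objective is handed to `scipy.optimize.minimize`.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical setup: target distances d[i, j] taken from a random
# ground-truth 2-D configuration, so an exact fit exists.
rng = np.random.default_rng(0)
n = 5
truth = rng.normal(size=(n, 2))
d = np.linalg.norm(truth[:, None, :] - truth[None, :, :], axis=-1)

iu = np.triu_indices(n, k=1)  # index pairs (i, j) with i < j

def stress(flat):
    """Sum of squared differences between realized and target distances."""
    x = flat.reshape(n, 2)
    dist = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    return np.sum((dist[iu] - d[iu]) ** 2)

x0 = rng.normal(size=2 * n)          # random starting placement
res = minimize(stress, x0, method="L-BFGS-B")
x = res.x.reshape(n, 2)              # optimized point positions
print(res.fun)
```

Note the objective is non-convex, so a single run from a random start can stall in a local minimum; in practice one would restart from several initial placements and keep the best result.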


It's known as the Euclidean distance geometry problem (placing points so as to realize given pairwise distances), a classical hard problem. The least-squares formulation in the question is the "stress" objective of metric multidimensional scaling (MDS).