I have $n$ points in Euclidean space $\{\mathbf{a}_1, \mathbf{a}_2, \ldots, \mathbf{a}_n\}$ and desired distances to them $\{d_1, d_2, \ldots, d_n\}$. How can I find the optimal point $\mathbf{x}$ that minimizes $\sum_{i=1}^n \left[\lVert \mathbf{x} - \mathbf{a}_i \rVert - d_i\right]^2$?
What I am doing now is placing $\mathbf{x}$ somewhere and differentiating the objective. However, gradient descent very often ends up in a local minimum.
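For concreteness, here is a minimal sketch of that approach (the anchor points and distances are made up for illustration), using SciPy's gradient-based minimizer with several random restarts to reduce the chance of getting stuck in a poor local minimum:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Illustrative data: anchor points a_i and desired distances d_i.
# (These are chosen so the true optimum is (4, 3) with objective 0.)
a = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
d = np.array([5.0, 3.0, 4.0])

def objective(x):
    # sum_i (||x - a_i|| - d_i)^2
    r = np.linalg.norm(x - a, axis=1)
    return np.sum((r - d) ** 2)

# Multi-start: run a local minimizer from several random initial points
# and keep the best result, since the objective is non-convex.
best = None
for _ in range(20):
    x0 = rng.uniform(-10.0, 10.0, size=2)
    res = minimize(objective, x0, method="BFGS")
    if best is None or res.fun < best.fun:
        best = res

print(best.x, best.fun)
```

Multi-start does not guarantee the global minimum, but in low dimensions a modest number of restarts usually finds it.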
Say $\mathbf{x} = (x, y)$ and each $\mathbf{a}_i = (x_i, y_i)$. This means that we want to minimize $$\sum_{i=1}^{n} \left(\sqrt{(x-x_i)^2+(y-y_i)^2}-d_i\right)^2.$$
Expanding this and discarding constant terms such as $d_i^2$, $x_i^2$, and $y_i^2$ yields $$n(x^2+y^2)-2x\sum_{i=1}^{n}x_i -2y\sum_{i=1}^{n}y_i -2\sum_{i=1}^{n} d_i\sqrt{(x-x_i)^2+(y-y_i)^2}.$$
Taking the partial derivative with respect to $x$, dividing by $2$, and setting it equal to $0$ gives $$nx-\sum_{i=1}^{n}x_i - \sum_{i=1}^{n} d_i\frac{x-x_i}{\sqrt{(x-x_i)^2+(y-y_i)^2}} = 0.$$
The partial derivative with respect to $y$ is analogous, with the roles of $x$ and $y$ swapped: $$ny-\sum_{i=1}^{n}y_i - \sum_{i=1}^{n} d_i\frac{y-y_i}{\sqrt{(x-x_i)^2+(y-y_i)^2}} = 0.$$
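Rearranging the two stationarity conditions as $x = \frac{1}{n}\left(\sum_i x_i + \sum_i d_i\frac{x-x_i}{r_i}\right)$ (and likewise for $y$), with $r_i = \lVert \mathbf{x}-\mathbf{a}_i \rVert$, suggests a Weiszfeld-style fixed-point iteration. A sketch with made-up data (convergence is not guaranteed in general, and the iteration can still settle at a local stationary point):

```python
import numpy as np

# Illustrative data (same symbols as the derivation): anchors a_i, distances d_i.
a = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
d = np.array([5.0, 3.0, 4.0])
n = len(a)

# Fixed-point map from the stationarity conditions:
#   x <- (1/n) * ( sum_i a_i + sum_i d_i * (x - a_i) / ||x - a_i|| )
x = a.mean(axis=0)  # start at the centroid
for _ in range(500):
    diff = x - a
    r = np.linalg.norm(diff, axis=1)
    r = np.maximum(r, 1e-12)  # guard against division by zero at an anchor
    x_new = (a.sum(axis=0) + (d[:, None] * diff / r[:, None]).sum(axis=0)) / n
    if np.linalg.norm(x_new - x) < 1e-12:
        break
    x = x_new

print(x)
```

Each step places $\mathbf{x}$ at the centroid of the anchors shifted by the average of the desired-length direction vectors, which is exactly what the two equations demand at a stationary point.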
Both of these equations must be satisfied at the optimal point. If the specific $\mathbf{a}_i, d_i$ were known, $x, y$ could be approximated using Mathematica or some other software. However, I do not expect any closed form for the optimal point to exist for $n \ge 3$.
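As a sanity check with concrete (made-up) data, one can minimize the objective numerically and confirm that both partial-derivative equations vanish at the result:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up data: three anchors and desired distances.
a = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
d = np.array([5.0, 3.0, 4.0])
n = len(a)

def objective(p):
    return np.sum((np.linalg.norm(p - a, axis=1) - d) ** 2)

res = minimize(objective, x0=np.array([1.0, 1.0]), method="BFGS")
x, y = res.x

# Evaluate the left-hand sides of the two stationarity conditions.
r = np.sqrt((x - a[:, 0]) ** 2 + (y - a[:, 1]) ** 2)
gx = n * x - a[:, 0].sum() - np.sum(d * (x - a[:, 0]) / r)
gy = n * y - a[:, 1].sum() - np.sum(d * (y - a[:, 1]) / r)
print(gx, gy)  # both should be numerically close to zero
```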