Is there any chance that the following problem is solvable numerically (I do not hope for an analytic solution) in the sense of being convex, so that an optimizer is guaranteed to reach the global optimum?
I have a starting distance matrix $D_0$ as well as an adjacency matrix $A$ representing an undirected graph. There is also a function mapping distances to probabilities:
$$ p_{ij} : \mathbb{R} \to (0, 1) $$
defined as follows:
$$ p_{ij}(d_{ij}) = \frac{1}{1 + e^{d_{ij} - 1}} $$
(although in fact it can be any monotonic, sigmoidal function)
And I have the following log-likelihood function (objective function to maximize):
$$ \ell = \sum_{j < i} e_{ij}\log{}(p_{ij}) + (1-e_{ij})\log{}(1 - p_{ij}) $$
where $e_{ij} \in \{0, 1\}$ are the entries of the adjacency matrix $A$.
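For concreteness, here is a minimal numerical sketch of this objective in Python with NumPy (treating $e_{ij}$ as the entries of $A$; the function names are mine):

```python
import numpy as np

def edge_prob(d):
    """The given sigmoid mapping distances to edge probabilities."""
    return 1.0 / (1.0 + np.exp(d - 1.0))

def log_likelihood(D, A):
    """Bernoulli log-likelihood summed over unordered pairs (j < i)."""
    i, j = np.triu_indices_from(D, k=1)
    p = edge_prob(D[i, j])
    e = A[i, j]
    return np.sum(e * np.log(p) + (1 - e) * np.log(1 - p))
```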
And the crux is that the distances $d_{ij}$ are the parameters to be optimized, but they have to remain distances.
So in summary this is a relatively simple Maximum Likelihood Estimation problem, but the optimization ranges not over $X = \mathbb{R}^{N(N-1)/2}$ but over a constrained set $Y \subset X$ consisting only of valid distance matrices.
EDIT.
A little bit of clarification. The distance matrices have to satisfy a single constraint: they must be proper metrics, that is, they should have the following properties:
- symmetry ($d_{ij} = d_{ji}$)
- non-negativity ($d_{ij} \geq 0$)
- triangle inequality ($d_{ij} \leq d_{ik} + d_{kj}$)
So in fact they do not have to be Euclidean (I removed all mentions of Euclidean distance from the rest of the post).
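These three properties are easy to verify numerically for a candidate matrix; a small sketch (the vectorized triangle check compares $d_{ij}$ against $\min_k (d_{ik} + d_{kj})$; the function name is mine):

```python
import numpy as np

def is_metric(D, tol=1e-9):
    """Check symmetry, non-negativity, and the triangle inequality."""
    symmetric = np.allclose(D, D.T, atol=tol)
    nonnegative = bool((D >= -tol).all())
    # S[i, j, k] = D[i, k] + D[k, j]; triangle holds iff D[i, j] <= min_k S[i, j, k]
    S = D[:, None, :] + D.T[None, :, :]
    triangle = bool((D <= S.min(axis=2) + tol).all())
    return symmetric and nonnegative and triangle
```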
So the goal is to find a distance matrix (with the above properties) that maximizes the likelihood objective.
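One way to attack this numerically: the triangle inequalities and non-negativity are linear in the $d_{ij}$, so the feasible set $Y$ is a convex polyhedron and off-the-shelf constrained solvers apply directly to the vectorized upper triangle. A sketch with SciPy (the 4-node graph, starting point, and solver choice are illustrative assumptions, not from the post):

```python
import itertools
import numpy as np
from scipy.optimize import minimize, LinearConstraint

def neg_log_likelihood(d, e):
    """Negative log-likelihood over vectorized upper-triangular distances."""
    p = 1.0 / (1.0 + np.exp(d - 1.0))
    return -np.sum(e * np.log(p) + (1 - e) * np.log(1 - p))

def triangle_constraints(n):
    """Rows encoding d_ik + d_kj - d_ij >= 0 for every pair (i, j) and pivot k."""
    pairs = list(itertools.combinations(range(n), 2))
    idx = {frozenset(p): m for m, p in enumerate(pairs)}
    rows = []
    for (i, j) in pairs:
        for k in range(n):
            if k in (i, j):
                continue
            row = np.zeros(len(pairs))
            row[idx[frozenset((i, k))]] += 1.0
            row[idx[frozenset((k, j))]] += 1.0
            row[idx[frozenset((i, j))]] -= 1.0
            rows.append(row)
    return LinearConstraint(np.array(rows), 0.0, np.inf)

# Illustrative 4-node graph (this adjacency matrix is an assumption).
n = 4
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])
e = A[np.triu_indices(n, k=1)]
d0 = np.ones(n * (n - 1) // 2)  # playing the role of D_0: all distances 1
res = minimize(neg_log_likelihood, d0, args=(e,), method="trust-constr",
               constraints=[triangle_constraints(n)],
               bounds=[(0.0, None)] * len(d0))
```

Note that the bounds and linear constraints keep every iterate a valid metric (up to solver tolerance), so no projection step back onto $Y$ is needed.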