Finding the point that minimises the maximum divergence from a set of points


Consider a finite set of points $S \subseteq \mathbb{R}^n$ and a divergence $D : \mathbb{R}^n \times \mathbb{R}^n \rightarrow [0, \infty]$. I'm interested in finding the point that minimises the maximum divergence from the points in $S$: that is, the point $z \in \mathbb{R}^n$ such that for every $z' \in \mathbb{R}^n$ with $z' \neq z$, $$\max_{x \in S} D(x, z) < \max_{x \in S} D(x, z').$$

I'm interested in the answer for an arbitrary Bregman divergence $D$, but especially for squared Euclidean distance and the Kullback–Leibler divergence. Are there analytic solutions to this? Or known algorithms for computing the point? Any help very much appreciated!
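For context on the squared-Euclidean special case: since squared Euclidean distance is monotone in Euclidean distance, the minimax point there is the centre of the minimum enclosing ball of $S$, for which simple iterative schemes are known. Below is a sketch in the spirit of the Bădoiu–Clarkson core-set iteration (repeatedly step toward the current farthest point with shrinking step size); the function name and iteration count are my own choices, not from any particular library:

```python
import numpy as np

def minimax_center(S, iters=1000):
    """Approximate the point minimising the maximum squared Euclidean
    distance to the rows of S (the minimum-enclosing-ball centre)."""
    z = S.mean(axis=0)  # any initial point inside the hull works
    for t in range(1, iters + 1):
        # index of the point currently farthest from z
        far = np.argmax(((S - z) ** 2).sum(axis=1))
        # move toward the farthest point with step size 1/(t+1)
        z = z + (S[far] - z) / (t + 1)
    return z
```

For example, for two points $(0,0)$ and $(2,0)$ this converges to $(1,0)$, the midpoint, where the maximum distance to $S$ is minimised. This scheme does not directly apply to asymmetric divergences such as KL; the general Bregman case needs the divergence's own geometry.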