Problem. Let $A$ be a finite set of points in Euclidean space $\mathbb R^n$. Let $B$ be a ball of smallest radius containing $A$. Prove that $B$ is unique.
The natural idea is to use a contradiction argument, i.e. assume that there exist two smallest balls with the same radius but distinct centers. But how do I deduce a contradiction from here? I spent the whole day thinking about this but made no progress. Thank you for any ideas!
Comment. I found a paper of Mordukhovich et al., but they use variational analysis in their arguments, which seems too heavy for such a "simple" problem, I think.
Suppose we have two smallest (closed) balls $B_r(a)$ and $B_r(b)$ containing the points. Then consider $c = \frac{a + b}{2}$.
Let us note that for every point $e$, we have $d(c, e) \leq \max(d(a, e), d(b, e))$. Moreover, when $a \neq b$, the inequality is strict: $d(c, e) < \max(d(a, e), d(b, e))$.
This is a basic fact of plane geometry (since $a, b, c$ are collinear, the four points $a, b, c, e$ are coplanar).
Edit: let us elaborate on this. Since $a, b, c, e$ are coplanar and $a \neq b$, we may choose coordinates in that plane so that, WLOG, $a = (-1, 0)$, $b = (1, 0)$, and $e = (x, y)$; then $c = (0, 0)$. Since $-1/2 < 1/2$, at least one of $x > -1/2$ or $x < 1/2$ holds. By symmetry (swapping $a$ and $b$ if necessary), WLOG suppose $x < 1/2$. Then $d(c, e)^2 = x^2 + y^2$ and $d(b, e)^2 = (1 - x)^2 + y^2 = x^2 + (1 - 2x) + y^2$. Since $x < 1/2$, we have $1 - 2x > 0$, so $d(c, e)^2 < d(b, e)^2$ and thus $d(c, e) < d(b, e) \leq \max(d(a, e), d(b, e))$.
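As a quick sanity check (not part of the proof), the strict inequality can be verified numerically on random configurations; here is a small Python sketch with hypothetical helper names:

```python
import math
import random

def dist(p, q):
    """Euclidean distance between two points given as tuples."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 5)  # ambient dimension
    a = tuple(random.uniform(-10, 10) for _ in range(n))
    b = tuple(random.uniform(-10, 10) for _ in range(n))
    e = tuple(random.uniform(-10, 10) for _ in range(n))
    if a == b:
        continue  # strictness requires distinct centers
    c = tuple((ai + bi) / 2 for ai, bi in zip(a, b))  # midpoint of a, b
    # strict inequality: d(c, e) < max(d(a, e), d(b, e)) when a != b
    assert dist(c, e) < max(dist(a, e), dist(b, e))
```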
It follows that for every point $x \in A$ we have $d(c, x) < \max(d(a, x), d(b, x)) \leq r$, since $x$ lies in both balls. Therefore $r' := \max\limits_{x \in A} d(c, x) < r$, because $A$ is finite (and nonempty). So the smaller ball $B_{r'}(c)$ contains all the points of $A$, contradicting the minimality of $r$. Hence $a = b$, and the smallest enclosing ball is unique.
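The whole argument can also be illustrated numerically (a sketch under assumed helper names, not a proof): if balls of the same radius about two distinct centers both contain a finite set, the midpoint gives a strictly smaller enclosing radius.

```python
import math
import random

def dist(p, q):
    """Euclidean distance between two points given as tuples."""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def enclosing_radius(center, points):
    """Smallest r such that the closed ball B_r(center) contains all points."""
    return max(dist(center, p) for p in points)

random.seed(1)
# a hypothetical finite point set A in R^3
A = [tuple(random.uniform(-5, 5) for _ in range(3)) for _ in range(20)]

a = (6.0, 0.0, 0.0)
b = (-6.0, 0.0, 0.0)   # two distinct candidate centers
# r is large enough that both B_r(a) and B_r(b) contain A
r = max(enclosing_radius(a, A), enclosing_radius(b, A))
# the midpoint c does strictly better, as in the proof
c = tuple((ai + bi) / 2 for ai, bi in zip(a, b))
assert enclosing_radius(c, A) < r
```

Since each $d(c, p)$ is strictly below $\max(d(a, p), d(b, p)) \leq r$ and the maximum is over finitely many points, the final assertion always holds when $a \neq b$.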