Suppose there are $N$ objects of equal mass $1$ (for simplicity) at locations $(x_i)_{i=1}^N$, where $x_i$ is uniformly distributed in the square of side length $L$. The center of mass is given by: $$R=\dfrac{1}{N}\sum_i m_ix_i=\dfrac{1}{N}\sum_i x_i$$
Now, assume the same object locations, but now the masses are given by: $$m_i=\dfrac{\kappa}{\sum_{j=1}^N||x_i-x_j||}$$
where $\kappa$ is a constant chosen so that the new masses add up to $N$ (like the previous masses). This mass assignment gives more mass to objects that lie close to one another (i.e., that form clusters), since the distances $||x_i-x_j||$ in the denominator are smaller, making the mass larger. The new arrangement has center of mass:
$$R'=\dfrac{1}{N}\sum_i m_ix_i=\dfrac{1}{N}\sum_i \dfrac{\kappa x_i}{\sum_{j=1}^N||x_i-x_j||}$$
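For concreteness, both centers of mass can be computed numerically. A minimal sketch in Python/NumPy (the function name and setup are my own, not part of the problem statement):

```python
import numpy as np

def centers_of_mass(x):
    """x: (N, d) array of locations. Returns R, R', and the masses m_i."""
    N = len(x)
    # Pairwise distances ||x_i - x_j||.
    D = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    s = D.sum(axis=1)            # sum_j ||x_i - x_j|| for each i
    kappa = N / (1.0 / s).sum()  # normalization so the masses sum to N
    m = kappa / s
    R = x.mean(axis=0)                    # uniform masses, all equal to 1
    Rp = (m[:, None] * x).sum(axis=0) / N
    return R, Rp, m

rng = np.random.default_rng(0)
L = 1.0
x = rng.uniform(0, L, size=(10, 2))  # N = 10 points in a square of side L
R, Rp, m = centers_of_mass(x)
```

Here `kappa = N / sum_i(1/s_i)` is exactly the constant that makes $\sum_i m_i = N$.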
Let $B(x,\delta)$ be the ball centered at $x$ with radius $\delta$. Let $X_n$ denote an $n$-element subset of the objects $x_1,x_2,\dots,x_N$. Clearly, $1\leq n\leq N$ (yes, the empty set is excluded). Define $\alpha_n(x)=\min_{X_n}\inf\{\delta \mid X_n\subset B(x,\delta)\}$, where the outer minimum runs over all $n$-element subsets (i.e., $X_n$ is always chosen so as to minimize $\delta$). Then, I would like to show that $\alpha_n(R')\leq \alpha_n(R)$ for $n<N$ with high probability (over the locations).
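Note that for a fixed center $c$, the smallest $\delta$ with $X_n\subset B(c,\delta)$ is $\max_{p\in X_n}||p-c||$, so the minimizing subset is just the $n$ points nearest to $c$; hence $\alpha_n(c)$ is the distance from $c$ to its $n$-th nearest object. A sketch of this computation (function name is my own):

```python
import numpy as np

def alpha(points, c, n):
    # alpha_n(c): X_n subset of B(c, delta) forces delta >= max distance
    # from c to the points of X_n, so the minimizing subset is the n points
    # nearest to c, and alpha_n(c) is the n-th smallest distance ||x_i - c||.
    dists = np.sort(np.linalg.norm(points - c, axis=1))
    return dists[n - 1]

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
c = np.array([0.0, 0.0])
print(alpha(pts, c, 2))  # distance to the 2nd nearest point: 1.0
```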
My thoughts so far
I thought about the simplest case and this is what I came up with so far.
Let $N=3$, $n=2$, and $x_i\in \mathbb{R}$. Label the points $x_1,x_2,x_3$ from left to right, and let $d_{1,2}=|x_1-x_2|$ and $d_{2,3}=|x_2-x_3|$. Without loss of generality assume that $d_{1,2}\geq d_{2,3}$ (if this is not the case, reflecting the line and relabeling the points fixes it). Let us consider first the case when $d_{1,2}=d_{2,3}=d$. In that case, $R=x_2$ and $R'=x_2$, and $X_n=\{x_2,x_1\}$ or, equivalently, $X_n=\{x_2,x_3\}$. Moreover, writing out the equation for $R'$, we get:
$$R'=\dfrac{\kappa}{3}\left(\dfrac{x_2-d}{d_{1,2}+(d_{1,2}+d_{2,3})}+\dfrac{x_2}{d_{1,2}+d_{2,3}}+\dfrac{x_2+d}{(d_{1,2}+d_{2,3})+d_{2,3}}\right)=x_2$$
since $\kappa$ is the normalization constant: the coefficients of $x_2$ sum to $1$, and the $\pm d$ terms cancel by symmetry (both have denominator $3d$).
Now suppose $x_2$ is nudged by $\epsilon$ to the right, to $x_2'=x_2+\epsilon$, so that the condition $d_{1,2}\geq d_{2,3}$ holds strictly. After the nudge, we have $R=x_2+\epsilon/3$; in particular, $R$ is closer to $x_3$ than to $x_1$, therefore $X_n=\{x_2',x_3\}$. For $R'$ we have the following:
$$R'=\dfrac{\kappa}{3}\left(\dfrac{x_2-d}{(d+\epsilon)+2d}+\dfrac{x_2+\epsilon}{(d+\epsilon)+(d-\epsilon)}+\dfrac{x_2+d}{2d+(d-\epsilon)}\right)$$ $$=\dfrac{1}{3}\left(\dfrac{\kappa(x_2-d)}{3d+\epsilon}+\dfrac{\kappa(x_2+\epsilon)}{2d}+\dfrac{\kappa(x_2+d)}{3d-\epsilon}\right)$$
Relative to the outermost terms, the middle term $\dfrac{x_2+\epsilon}{2d}$ is the most influential because its denominator is only $2d$ (whereas in $R$ all terms have the same denominator, making them equally influential, which is why we get the $\epsilon/3$); therefore $R'\geq x_2+\epsilon/3=R$. Since $X_n=\{x_2',x_3\}$, and $R'$ is closer to $x_3$ than $R$ is, we get that $\alpha_2(R')\leq \alpha_2(R)$ with high probability. (In this case the probability is $1$, which is certainly covered by "with high probability".)
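A quick numerical sanity check of this example (my own sketch, taking $x_2=0$, $d=1$, $\epsilon=0.01$):

```python
import numpy as np

# The 3-point example after the nudge: x_1 = -d, x_2' = epsilon, x_3 = d.
d, eps = 1.0, 0.01
x = np.array([-d, eps, d])

s = np.abs(x[:, None] - x[None, :]).sum(axis=1)  # sum_j |x_i - x_j|
kappa = 3 / (1 / s).sum()                        # masses sum to N = 3
m = kappa / s

R = x.mean()           # uniform masses: equals eps / 3
Rp = (m * x).sum() / 3

# R' sits to the right of R, hence closer to x_3 ...
assert Rp > R
# ... so the ball around R' covering X_2 = {x_2', x_3} has smaller radius.
alpha2 = lambda c: np.sort(np.abs(x - c))[1]  # 2nd-nearest distance
assert alpha2(Rp) < alpha2(R)
```

With these numbers, $R=\epsilon/3\approx 0.0033$ while $R'\approx 0.0062$, consistent with the first-order argument above.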
Reflection and questions
I was able to prove the simplest case, but the argument felt convoluted and long-winded. I'd imagine there is a cleverer, more elegant proof that handles the general case, e.g., $N>3$ and $x_i\in \mathbb{R}^d$.
P.S. I skipped some details of my proof because I don't want to make the question too long, and I think what I wrote conveys the idea. However, if anyone would find it helpful for me to include more details, I'll be happy to do so.