Consider a set of $n$ points in the plane with positions $\mathbf{p}_1,\dots,\mathbf{p}_n$, such that each point $i$ has at least one neighbor $j$ at distance at most $\lambda$ (i.e. $||\mathbf{p}_i - \mathbf{p}_j||\leq \lambda$).
The question is: how do you choose the positions of the points in order to maximize their second moment, defined as:
$ U = \sum \limits_{i} ||\mathbf{p}_i - \bar{\mathbf{p}}||^2, $
where $\bar{\mathbf{p}}=\frac{1}{n}\sum \limits_{i} \mathbf{p}_i$ is the barycenter of the points.
Intuitively, I think the points should be placed along a straight line, spaced $\lambda$ apart, but I am not sure how to prove or disprove this.
Thanks for any help.
[From the comments I posted previously.]
For $n\ge 4$, the supremum of $U$ is $+\infty$, so no maximizing configuration exists. (For $n=2$ or $n=3$, the neighbor constraint forces all points into a cluster of diameter at most $2\lambda$, so $U$ is bounded in those cases.)
For even $n\ge 4$, we can simply place $n/2$ points on the left and $n/2$ points on the right:
$$\mathbf{p}_i=(-x+\lambda i,0) \quad \forall\, 1\le i \le n/2$$ $$\mathbf{p}_i=(x-\lambda (i-n/2),0) \quad \forall\, n/2+1\le i \le n$$
Within each half, consecutive points are exactly $\lambda$ apart, so every point has a neighbor at distance $\le\lambda$, and by symmetry the barycenter is at the origin. However, every point lies at distance at least $x-\tfrac{n}{2}\lambda$ from the origin, so $U\ge n\left(x-\tfrac{n}{2}\lambda\right)^2\rightarrow+\infty$ as $x\rightarrow +\infty$.
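As a quick numerical sanity check (a sketch, not a proof), here is a short Python snippet; the use of `numpy` and the helper names `second_moment` and `even_configuration` are my own choices, not part of the problem. It builds the even-$n$ configuration above, verifies that every point has a neighbor within $\lambda$ and that the barycenter sits at the origin, and shows $U$ growing as $x$ grows:

```python
import numpy as np

def second_moment(points):
    """U: sum of squared distances from the barycenter (mean) of the points."""
    centered = points - points.mean(axis=0)
    return float((centered ** 2).sum())

def even_configuration(n, x, lam):
    """Even n >= 4: two mirror-image clusters of n/2 collinear points, spaced lam apart."""
    assert n >= 4 and n % 2 == 0
    half = n // 2
    left = [(-x + lam * i, 0.0) for i in range(1, half + 1)]               # points 1..n/2
    right = [(x - lam * (i - half), 0.0) for i in range(half + 1, n + 1)]  # points n/2+1..n
    return np.array(left + right)

n, lam = 8, 1.0
for x in (10.0, 100.0, 1000.0):
    pts = even_configuration(n, x, lam)
    # pairwise distances; each point's nearest neighbor must lie within lam
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    assert d.min(axis=1).max() <= lam + 1e-9
    assert np.abs(pts.mean(axis=0)).max() < 1e-9  # barycenter at the origin
    print(x, second_moment(pts))                  # U keeps growing with x
```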
For odd $n\ge 5$, from @triple_sec: let $m=(n+1)/2$ and
$$\mathbf{p}_i=(x+(i-1)\lambda,0)$$ for $i\in\{1,\ldots,m\}$ and
$$\mathbf{p}_i=(-y-(i-m-1)\lambda,0)$$ for $i\in\{m+1,\ldots,n\}$, where $y=\lambda+\frac{n+1}{n-1}x$. Then $\sum_{i=1}^n \mathbf{p}_i=\mathbf{0}$, so the barycenter is again at the origin, and the objective function can be shown to be
$$\frac{n(n+1)}{n-1}\left(x^2+\frac{n-1}{2}\lambda x + \frac{(n-1)^2}{12}\lambda^2\right)$$
which diverges to infinity as $x\rightarrow +\infty$.
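Again as a sketch (same caveats as above; `odd_configuration` is a name I made up), a short numerical check agrees with this closed form:

```python
import numpy as np

def second_moment(points):
    # U: sum of squared distances from the barycenter (mean) of the points
    centered = points - points.mean(axis=0)
    return float((centered ** 2).sum())

def odd_configuration(n, x, lam):
    """Odd n >= 5: m = (n+1)/2 points marching right from x, the other (n-1)/2
    marching left from -y, with y chosen so that the barycenter is at the origin."""
    assert n >= 5 and n % 2 == 1
    m = (n + 1) // 2
    y = lam + (n + 1) / (n - 1) * x
    right = [(x + (i - 1) * lam, 0.0) for i in range(1, m + 1)]
    left = [(-y - (i - m - 1) * lam, 0.0) for i in range(m + 1, n + 1)]
    return np.array(right + left)

n, lam = 7, 1.0
for x in (1.0, 10.0, 100.0):
    pts = odd_configuration(n, x, lam)
    closed_form = n * (n + 1) / (n - 1) * (x**2 + (n - 1) / 2 * lam * x + (n - 1)**2 / 12 * lam**2)
    print(x, second_moment(pts), closed_form)  # the two values should coincide
```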