Optimal Configuration for a Set of Points


Consider a set of $n$ points on the plane with positions $\mathbf{p}_1,\dots,\mathbf{p}_n$, such that each point $i$ has at least one neighbor $j$ at a distance of no more than $\lambda$ away from it (i.e. $||\mathbf{p}_i - \mathbf{p}_j||\leq \lambda$).
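For concreteness, the neighbor condition is easy to check mechanically. Here is a small Python helper (the function name is my own, not part of the question):

```python
import math

def satisfies_neighbor_condition(points, lam):
    """True iff every point has at least one OTHER point within distance lam.

    points: list of (x, y) tuples; lam: the bound lambda.
    """
    return all(
        any(
            math.hypot(xi - xj, yi - yj) <= lam
            for j, (xj, yj) in enumerate(points)
            if j != i
        )
        for i, (xi, yi) in enumerate(points)
    )
```

Note that the condition is local: two tight pairs that are far from each other are perfectly admissible, e.g. `[(0, 0), (1, 0), (10, 0), (11, 0)]` with $\lambda = 1$.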

The question is: how do you choose the positions of the points in order to maximize their second moment, defined as:

$ U = \sum \limits_{i} ||\mathbf{p}_i - \bar{\mathbf{p}}||^2, $

where $\bar{\mathbf{p}}=\frac{1}{n}\sum \limits_{i} \mathbf{p}_i$ is the barycenter of the points.

Intuitively, I think that the points should be placed along a straight line, spaced $\lambda$ apart, but I am not sure how to prove or disprove this.

Thanks for any help.

There are 2 answers below.

Accepted answer:

[From the comments I posted previously.]

The supremum of $U$ is $+\infty$; that is, $U$ is unbounded above.

For even $n\ge 4$, we can split the points into two mirrored groups of $n/2$:

$$\mathbf{p}_i=(-x+\lambda i,0) \quad \forall\, 1\le i \le n/2$$ $$\mathbf{p}_{n/2+j}=(x-\lambda j,0) \quad \forall\, 1\le j \le n/2$$

Within each group, consecutive points are exactly $\lambda$ apart, so every point has at least one neighbor at distance $\le\lambda$, and by symmetry the barycenter is the origin. However, $U \rightarrow+\infty$ as $x\rightarrow +\infty$. (For $n=2$ the two points must be within $\lambda$ of each other, so $U=\|\mathbf{p}_1-\mathbf{p}_2\|^2/2\le\lambda^2/2$ is bounded; the unbounded construction needs at least two points per group.)
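This is easy to confirm numerically. The sketch below is my own code; it builds two mirrored groups of $n/2$ collinear points spaced $\lambda$ apart (the second group indexed so that the barycenter is exactly the origin) and evaluates $U$ directly:

```python
def even_configuration(n, lam, x):
    """Two mirrored groups of n/2 collinear points, each spaced lam apart.

    Requires even n >= 4, so every point has a neighbor within its own group.
    """
    assert n % 2 == 0 and n >= 4
    left = [(-x + lam * i, 0.0) for i in range(1, n // 2 + 1)]
    right = [(x - lam * j, 0.0) for j in range(1, n // 2 + 1)]
    return left + right

def second_moment(points):
    """U = sum of squared distances to the barycenter."""
    k = len(points)
    bx = sum(px for px, _ in points) / k
    by = sum(py for _, py in points) / k
    return sum((px - bx) ** 2 + (py - by) ** 2 for px, py in points)
```

For example, with $n=4$, $\lambda=1$, $x=10$ the points sit at $-9,-8,8,9$ on the $x$-axis, giving $U = 81+64+64+81 = 290$, and $U$ keeps growing as $x$ does.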

For odd $n\ge 5$, from @triple_sec: let $m=(n+1)/2$ and

$$\mathbf{p}_i=(x+(i-1)\lambda,0)$$ for $i\in\{1,\ldots,m\}$ and

$$\mathbf{p}_i=(-y-(i-m-1)\lambda,0)$$ for $i\in\{m+1,\ldots,n\}$, where $y=\lambda+\frac{n+1}{n-1}x$. Then $\sum_{i=1}^n \mathbf{p}_i=\mathbf{0}$, whereas the objective function can be shown to be

$$\frac{n(n+1)}{n-1}\left(x^2+\frac{n-1}{2}\lambda x + \frac{(n-1)^2}{12}\lambda^2\right),$$

which diverges to infinity as $x\rightarrow +\infty$.
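A quick numerical check of this closed form (my own sketch; the helper names are mine):

```python
def odd_configuration(n, lam, x):
    """@triple_sec's construction for odd n >= 5: m = (n+1)/2 points on the
    right starting at x, the remaining (n-1)/2 on the left starting at -y,
    both groups spaced lam apart."""
    assert n % 2 == 1 and n >= 5
    m = (n + 1) // 2
    y = lam + (n + 1) / (n - 1) * x
    right = [(x + (i - 1) * lam, 0.0) for i in range(1, m + 1)]
    left = [(-y - (i - m - 1) * lam, 0.0) for i in range(m + 1, n + 1)]
    return right + left

def predicted_U(n, lam, x):
    """The closed form quoted above."""
    return n * (n + 1) / (n - 1) * (
        x ** 2 + (n - 1) / 2 * lam * x + (n - 1) ** 2 / 12 * lam ** 2
    )

def second_moment(points):
    """U = sum of squared distances to the barycenter."""
    k = len(points)
    bx = sum(px for px, _ in points) / k
    by = sum(py for _, py in points) / k
    return sum((px - bx) ** 2 + (py - by) ** 2 for px, py in points)
```

For instance, with $n=5$, $\lambda=1$, $x=2$ the points sit at $2,3,4,-4,-5$ on the $x$-axis: their sum is $0$, and both the direct computation and the closed form give $U=70$.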

Second answer:

Here is a rough idea. Since the objective and the constraint are invariant to simultaneous translation and rotation of all points, you can assume wlog that $\overline{\mathbf{p}}=\mathbf{0}$ and that $\mathbf{p}_n=(x_n,0)$, for some $x_n\in\mathbb{R}$. Using these constraints, you can express the coordinates of the points as follows: $$\mathbf{p}_1=\left(\begin{array}{c}x_1\\y_1\end{array}\right)\quad\cdots\quad\mathbf{p}_{n-2}=\left(\begin{array}{c}x_{n-2}\\y_{n-2}\end{array}\right)\quad\mathbf{p}_{n-1}=\left(\begin{array}{c}x_{n-1}\\-(y_1+\cdots+y_{n-2})\end{array}\right)\quad\mathbf{p}_{n}=\left(\begin{array}{c}-(x_1+\cdots+x_{n-1})\\0\end{array}\right).$$

In simplifying the last two vectors, I exploited the requirement that the points' barycenter be the origin. Now, the problem reduces to choosing $((x_i)_{i=1}^{n-1},(y_i)_{i=1}^{n-2})$, that is, there are $2n-3$ free variables. The objective is to maximize the sum of squares of these numbers: $$\sum_{i=1}^{n-2}(x_i^2+y_i^2)+x_{n-1}^2+(y_1+\cdots+y_{n-2})^2+(x_1+\cdots+x_{n-1})^2$$
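As a sanity check of this reduction (my own sketch, not part of the original answer), one can rebuild the full point set from the free variables and confirm both that the barycenter vanishes and that the direct second moment matches the expression above:

```python
def points_from_free_vars(xs, ys):
    """Rebuild all n points from the free variables xs (length n-1) and
    ys (length n-2), filling in the last coordinates from the barycenter
    constraint as in the reduction above."""
    assert len(xs) == len(ys) + 1
    pts = [(xs[i], ys[i]) for i in range(len(ys))]
    pts.append((xs[-1], -sum(ys)))   # p_{n-1}: y-coordinate balances the ys
    pts.append((-sum(xs), 0.0))      # p_n: on the x-axis, balances the xs
    return pts

def reduced_objective(xs, ys):
    """The objective written directly in the free variables."""
    return (
        sum(x * x for x in xs[:-1])
        + sum(y * y for y in ys)
        + xs[-1] ** 2
        + sum(ys) ** 2
        + sum(xs) ** 2
    )
```

For example, with $n=4$, $xs=(1,2,-1)$, $ys=(0.5,-1.5)$ the rebuilt points are $(1,0.5)$, $(2,-1.5)$, $(-1,1)$, $(-2,0)$; their barycenter is the origin, and both computations of the objective give $13.5$.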

subject to the constraint that for each $i\in\{1,\ldots,n\}$, there must exist some $j\in\{1,\ldots,n\}$, $j\neq i$, such that $(x_i-x_j)^2+(y_i-y_j)^2\leq\lambda^2$. Here, $y_{n-1}=-(y_1+\cdots+y_{n-2})$, $x_n=-(x_1+\cdots+x_{n-1})$, and $y_n=0$.

Again wlog, you can assume that $\mathbf{p}_1$ is $\lambda$-close to $\mathbf{p}_2$: $$(x_1-x_2)^2+(y_1-y_2)^2\leq\lambda^2.$$ This will be one of the constraints, so you need not worry about finding close points for $\mathbf{p}_1$ and $\mathbf{p}_2$ anymore. The real headache is finding near neighbors for the other points. Assigning a candidate neighbor to each of the points $\{\mathbf{p}_k\}_{k=3}^n$ gives at most $(n-1)^{n-2}$ configurations: each of the $n-2$ points has at most $n-1$ candidate neighbors, and the choices multiply. For each configuration, you can write the set of constraints imposing $\lambda$-closeness explicitly and solve the resulting constrained maximization problem. Then, out of the $(n-1)^{n-2}$ possible constrained maximization problems, pick the one that yields the greatest value of the objective. This could be done using a computer algorithm, either (i) solving the Kuhn–Tucker problems symbolically; or (ii) for a given value of $\lambda$, solving them numerically to see whether your conjecture that the points must lie on a straight line is correct (I lean toward believing it is).
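The enumeration of neighbor assignments can be sketched as follows (my own code; it only generates the assignments and their constraint pairs, leaving each per-configuration solve to a numerical optimizer of your choice):

```python
from itertools import product

def neighbor_assignments(n):
    """All ways to assign one candidate neighbor j != i to each point
    i in {3, ..., n}; the independent choices multiply, so there are
    (n-1)**(n-2) assignments in total."""
    candidates = [
        [j for j in range(1, n + 1) if j != i] for i in range(3, n + 1)
    ]
    return [
        dict(zip(range(3, n + 1), choice)) for choice in product(*candidates)
    ]

def closeness_constraints(assignment):
    """For one assignment, the (i, j) pairs that must satisfy
    (x_i - x_j)^2 + (y_i - y_j)^2 <= lambda^2, including the fixed
    p_1 -- p_2 constraint assumed above."""
    return [(1, 2)] + sorted(assignment.items())
```

For $n=5$ this yields $4^3 = 64$ assignments, each contributing four closeness constraints to its maximization problem.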

I'm pretty sure a more efficient and cleverer algorithm exists, but this is what I could come up with so far. It seems to be a difficult problem (or I'm not proficient enough at convex optimization; the two possibilities are quite likely not mutually exclusive).