Find Algorithm, Given a List of Arcs, which Maximizes the Number that Fit on a Circle


I'm trying to find an optimal algorithm that, given a list of arcs $(x_i, y_i)$, where $x_i$ and $y_i$ are the starting and ending angles of the arc in radians, maximizes the number of arcs that can be placed on a circle without overlapping.

I'm also curious as to how one might solve the problem of maximizing profit if each arc also had an associated price.

A dynamic programming approach seems relevant, but I'm not sure how to implement it.

There are 2 best solutions below


For a simple implementation, I would suggest a variation on the first-fit-increasing algorithm (a fairly standard packing heuristic).

Rank each arc in order of size starting from the smallest to the largest. Size $= y_i-x_i$. The rationale here is that to fit as many as possible, you want the arcs to be as small as possible. If there is an associated profit then rank them by increasing value of $\frac{size}{profit}$.

Starting with the smallest arc (or the smallest $\frac{size}{profit}$ arc), add the arc to your circle. Only allow additional arcs if they do not overlap existing arcs.

Repeat until there are no more allowable arcs.
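The steps above can be sketched in Python as follows. This is a minimal sketch under the assumption that arcs are given with $0 \le x_i < y_i \le 2\pi$ (i.e., no arc wraps past angle $0$); the function name `greedy_fit` is ours, not from the original answer.

```python
def greedy_fit(arcs):
    """Place arcs smallest-first, keeping an arc only if it does not
    overlap any arc already placed.  Each arc is (x, y) with
    0 <= x < y <= 2*pi; wrap-around arcs are not handled here.
    For the profit variant, sort by (y - x) / profit instead."""
    placed = []
    for x, y in sorted(arcs, key=lambda a: a[1] - a[0]):
        # no overlap: the new arc ends before, or starts after, each placed arc
        if all(y <= px or py <= x for px, py in placed):
            placed.append((x, y))
    return placed
```

Note that this greedy rule is a heuristic, as the answer says ("for a simple implementation"); it is not guaranteed to find the optimum in all cases.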


If none of the arcs wraps around the angle $0$ (i.e., crosses from the fourth quadrant back into the first), we are reduced to the problem of maximizing non-overlapping line segments in $[0,2\pi]$, or more generally in $[L,R]$. Here's a DP approach for that:

Sort segments in ascending order according to their right end, i.e., we have segments $[a_1,b_1],\ldots ,[a_n,b_n]$ with $L\le a_1\le b_1\le \ldots\le b_n\le R$. Let $c(k)$ be the profit obtained by using segment $k$ (or $1$ in the basic problem variant). For convenience, let $b_0=L$. Let $g(k)=\max\{\,j\mid b_j\le a_k\,\}$.

We want to compute $f(k)$, the optimal value obtainable by picking only segments contained in $[L,b_k]$. Clearly $f(0)=0$, and otherwise $$f(k)=\max\Bigl(f(k-1),\;\max\{\,f(g(j))+c(j)\mid b_j=b_k\,\}\Bigr).$$ The final result is $f(n)$, of course.

Complexity: Sorting takes $O(n\log n)$ time. Computing $g()$ takes $O(n\log n)$ time and $O(n)$ memory. The final DP takes $O(n)$ time and memory.
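The DP above can be sketched in Python as follows. This is a sketch under the assumptions stated in the answer (no wrap-around arcs); the inner maximum over all $j$ with $b_j=b_k$ is handled implicitly by iterating $k$ in sorted order, and $g$ is computed with binary search. The name `max_weight_segments` is ours.

```python
from bisect import bisect_right

def max_weight_segments(segs, profits=None):
    """Weighted interval scheduling DP.  segs is a list of (a, b)
    with a <= b; profits defaults to all ones (the unweighted
    'maximize the count' variant)."""
    n = len(segs)
    if profits is None:
        profits = [1] * n
    # sort by right endpoint b, carrying profits along
    order = sorted(range(n), key=lambda i: segs[i][1])
    a = [segs[i][0] for i in order]
    b = [segs[i][1] for i in order]
    c = [profits[i] for i in order]
    # f[k] = best value using only the first k segments (1-based)
    f = [0] * (n + 1)
    for k in range(1, n + 1):
        # g(k): number of segments j < k with b_j <= a_k
        g = bisect_right(b, a[k - 1], 0, k - 1)
        f[k] = max(f[k - 1], f[g] + c[k - 1])
    return f[n]
```

For example, with segments $[0,2],[1,3],[2,4]$ the unweighted optimum is $2$ (take $[0,2]$ and $[2,4]$; touching endpoints are allowed since $g$ uses $b_j\le a_k$), while with profits $1,5,1$ the optimum is $5$ (take $[1,3]$ alone).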