Alternating projection algorithm fails to find an intersection


The projection onto convex sets (POCS, alternating projections) algorithm finds a point in the intersection of convex sets by projecting onto them in turn. It has been proven that, for closed convex sets with a nonempty intersection, the method converges for any starting point $x_0$.
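As a minimal sketch of the algorithm (names and the choice of sets are my own): alternate the nearest-point projections of two closed convex sets, here a disk and a half-plane, and the iterates settle in the intersection.

```python
import numpy as np

def project_disk(x, center, r):
    """Nearest point of the filled disk {p : |p - center| <= r}."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= r else center + r * d / n

def project_halfplane(x, a, b):
    """Nearest point of the half-plane {p : a . p <= b}."""
    s = a @ x - b
    return x if s <= 0 else x - s * a / (a @ a)

def pocs(x0, proj1, proj2, iters=100):
    """Alternate the two projections; for closed convex sets with
    nonempty intersection this converges for any starting point."""
    x = x0
    for _ in range(iters):
        x = proj2(proj1(x))
    return x

center, r = np.array([0.0, 0.0]), 2.0       # disk of radius 2 at the origin
a, b = np.array([1.0, 1.0]), 1.0            # half-plane x + y <= 1

x = pocs(np.array([5.0, 4.0]),
         lambda p: project_disk(p, center, r),
         lambda p: project_halfplane(p, a, b))
# x lies (up to tolerance) in both sets simultaneously
```

Here the iterates land in the intersection after only a couple of steps; in general the convergence can be slow when the sets meet at a shallow angle.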

In the example below I try to apply the algorithm and find that it converges to a local minimum:

[Figure: alternating projections between the ellipses $E_1$ and $E_2$ stalling at a local minimum]

To determine the point $x_{i+1}$ on the ellipse $E_2$ from a point $x_i$ on the ellipse $E_1$, I take the point of $E_2$ closest to $x_i$. The procedure ends at a point $x_N$, which shows that the algorithm has converged to a local minimum of the distance, $|x_N - x_{N-1}|$ (you can also picture two rectangles to see the same issue; there it converges after one iteration).
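The stalling described above can be reproduced with two circles standing in for the ellipses (my own substitution, chosen because the nearest-point projection onto a circle has a closed form). The circle *boundaries* are not convex sets, and alternating nearest-point projection between the two curves gets stuck in a 2-cycle that realizes a local minimum of the distance, never reaching the actual intersection points.

```python
import numpy as np

def project_circle(x, center, r):
    """Nearest point of the circle |p - center| = r (the curve only,
    not the filled disk -- this set is NOT convex)."""
    d = x - center
    return center + r * d / np.linalg.norm(d)

c1, c2, r = np.array([0.0, 0.0]), np.array([3.0, 0.0]), 2.0
# These circles intersect at (1.5, +-sqrt(1.75)).

x = np.array([-2.0, 0.0])                  # start on the first circle
for _ in range(50):
    x = project_circle(x, c2, r)           # nearest point on circle 2 ...
    x = project_circle(x, c1, r)           # ... then back onto circle 1
# The iterates settle on the 2-cycle (2, 0) <-> (1, 0): a local minimum
# of the distance between the two curves, not an intersection point.
```

The convergence theorem is not contradicted, because its hypothesis (convexity of the sets) fails for the curves.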

Had I chosen the point $y_1$ as the starting point, I would have found the nearest intersection. Alternatively, had I followed the dashed path (projecting onto the point of $E_1$ that is farther away) instead of picking the closest point on $E_1$, I would also have found an intersection.

The literature makes no mention of ending up in local minima, and this behavior seems to contradict the convergence proof.

Can someone explain to me what is going on and how I can make sure that I always find an intersection if it exists?

Accepted answer:

I misunderstood the objective of the algorithm. I intended to find a point where the boundaries intersect; instead, the algorithm finds a point in the overlap of the filled regions $E_1$ and $E_2$, which are the convex sets the theorem is about. With the filled regions as the sets, it already converges at $x_1$.
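This resolution can be checked in the same circle configuration as above (circles standing in for the ellipses of the figure, my own substitution): projecting onto the *filled* disks, the very first projection already lands in the overlap, so the iteration is finished at $x_1$.

```python
import numpy as np

def project_disk(x, center, r):
    """Nearest point of the filled disk {p : |p - center| <= r},
    which is a convex set."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= r else center + r * d / n

c1, c2, r = np.array([0.0, 0.0]), np.array([3.0, 0.0]), 2.0
x0 = np.array([-2.0, 0.0])                 # a point of the first disk
x1 = project_disk(x0, c2, r)               # a single projection step
# x1 = (1, 0) already lies in both disks, so x1 is a fixed point of the
# alternating projections: the algorithm has converged after one step,
# at an interior point of the overlap rather than a boundary crossing.
```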