Find a point on $S$ with minimum distance to $x \in \mathbb{R}^n$


Let $S = \{x \in \mathbb{R}^n \;:\; |x_{i+1} - x_i | \leq 1 \text{ for } i = 1, \dots, n-1\}$, i.e. each component differs from its predecessor by at most one. Then $S$ is convex but unbounded. Given some other point $x \in \mathbb{R}^n$, I would like to find the point $s \in S$ that minimizes the Euclidean distance to $x$.

I am looking for an algorithm that ideally takes into account the specific structure of $S$, and ideally can be performed in time polynomial in $n$.

Thanks!

BEST ANSWER

As @LinAlg said, it is indeed your lucky day, and it is even luckier than you thought :)

Let $D$ be the 'forward-difference' matrix, that is, $D x$ is the vector $(x_2 - x_1, x_3 - x_2, \dots, x_n - x_{n-1})^T$. The problem you are attempting to solve is $$ \min_{s \in \mathbb{R}^n} \quad \|s - x\|_2^2 \quad \text{ s.t. }\quad \|D s\|_\infty \leq 1 $$
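For concreteness, the forward-difference matrix can be built in a few lines. This is a NumPy sketch; at scale you would use a sparse representation, since $D$ is bidiagonal:

```python
import numpy as np

n = 5
# forward-difference matrix D: (D x)_i = x_{i+1} - x_i, shape (n-1, n)
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)

x = np.array([1.0, 4.0, 2.0, 2.0, 7.0])
print(D @ x)  # same as np.diff(x): [ 3. -2.  0.  5.]
```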

Note that the matrix $D$ is very sparse (it is bidiagonal). You have the following options:

  1. Note that $\|D s\|_\infty \leq 1$ if and only if $-\mathbb{1} \leq D s \leq \mathbb{1}$, where $\mathbb{1}$ is the vector $\mathbb{1} = (1, \dots, 1)^T$. Therefore, you have a quadratic objective subject to linear constraints, and you can use any QP solver that accepts sparse constraint matrices. Hopefully, it will be able to exploit the structure of the matrix. You can also minimize the norm instead of the squared norm and use an SOCP solver.
  2. If a generic QP or SOCP solver is too slow, you can use a first-order method, such as ADMM. Defining $$ g(t) = \begin{cases} 0 & \|t\|_\infty \leq 1 \\ +\infty & \text{otherwise,}\end{cases} $$ your problem can be formulated as $$ \min_{s, t} \quad 0.5 \|s - x\|_2^2 + g(t) \quad \text{ s.t. }\quad D s = t. $$ The Augmented Lagrangian is $$ L(s, t, \lambda; \rho) = 0.5 \|s - x\|_2^2 + g(t) + (D^T \lambda)^T s - \lambda^T t + 0.5 \rho \|D s - t\|_2^2. $$ The major computational steps are the minimizations of $L$ over $t$ and over $s$, separately. Minimizing over $t$ is trivial: it is the Euclidean projection of $D s + \lambda / \rho$ onto the $\ell_\infty$ ball, i.e. componentwise clipping to $[-1, 1]$. If you need help, please tell me, but it is very simple. Minimizing over $s$ is a simple unconstrained quadratic problem, similar to least squares, and amounts to solving a linear system whose matrix is $P = I + \rho D^T D$. Because of the special structure of $D^T D$ (it is the second-difference matrix), you can solve a linear system of the form $P z = b$ in $O(n \log n)$ time using the well-known Discrete Cosine Transform. See the excellent article by Gilbert Strang called The Discrete Cosine Transform. So, to summarize, you have an iterative algorithm whose iterations are performed in $O(n \log n)$ time, and it converges rapidly due to the strong convexity of $\|s - x\|_2^2$.
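Option 1 can be sketched with an off-the-shelf solver. Here is a small example using SciPy's SLSQP; the solver choice and the example vector $x = (0, 3, 0, 3)$ are my own illustrations, not from the answer (a dedicated QP solver would be preferable for large $n$):

```python
import numpy as np
from scipy.optimize import minimize

n = 4
x = np.array([0.0, 3.0, 0.0, 3.0])            # example point outside S
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)  # forward differences

# min 0.5 ||s - x||^2  s.t.  -1 <= Ds <= 1, written as two 'ineq' constraints
res = minimize(
    fun=lambda s: 0.5 * np.sum((s - x) ** 2),
    x0=np.zeros(n),
    jac=lambda s: s - x,
    constraints=[
        {"type": "ineq", "fun": lambda s: 1.0 - D @ s},  # Ds <= 1
        {"type": "ineq", "fun": lambda s: 1.0 + D @ s},  # Ds >= -1
    ],
    method="SLSQP",
)
s = res.x
print(np.max(np.abs(np.diff(s))))  # feasibility check: at most 1
```

For this particular $x$, the projection works out to $(1, 2, 1, 2)$: each pair $(0, 3)$ is pulled to $(1, 2)$, and the middle difference $1 - 2 = -1$ happens to stay feasible.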
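The ADMM iteration in option 2 can be sketched in NumPy. For clarity, the $s$-step below uses a dense solve where the DCT-based $O(n \log n)$ solve (or an $O(n)$ tridiagonal solve) would go; `rho` and the iteration count are arbitrary choices, and the test vector is my own:

```python
import numpy as np

def project_onto_S(x, rho=1.0, iters=2000):
    """ADMM sketch for min 0.5 ||s - x||^2 s.t. ||Ds||_inf <= 1."""
    n = len(x)
    D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)  # forward differences
    P = np.eye(n) + rho * D.T @ D                 # tridiagonal; dense solve for clarity
    t = np.clip(D @ x, -1.0, 1.0)                 # feasible starting t
    lam = np.zeros(n - 1)
    for _ in range(iters):
        # s-step: solve (I + rho D^T D) s = x + D^T (rho t - lam)
        s = np.linalg.solve(P, x + D.T @ (rho * t - lam))
        # t-step: project Ds + lam/rho onto the infinity-norm ball (clipping)
        t = np.clip(D @ s + lam / rho, -1.0, 1.0)
        # dual ascent on the constraint Ds = t
        lam += rho * (D @ s - t)
    return s

s = project_onto_S(np.array([0.0, 3.0, 0.0, 3.0]))
print(np.round(s, 3))  # close to the true projection [1, 2, 1, 2]
```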