I am wondering if there is an efficient (perhaps closed-form) way to solve the following piecewise quadratic minimisation problem: $$ \min_{\mathbf{x}} \sum_{i=1}^n \min\left[ (\mathbf{c}_i^T\mathbf{x}-a_i)^2, (\mathbf{d}_i^T\mathbf{x}-b_i)^2 \right] $$ It is a combination of linear least squares problems where there are two possible residuals for each $i$ and we would like to take the smaller of the two.
I can see that one approach would be to solve all possible combinations of the two terms as separate linear least squares problems. E.g. solve: $$ \begin{bmatrix} \mathbf{c}_1^T \\ \vdots \\ \mathbf{c}_n^T \end{bmatrix}\mathbf{x}= \begin{bmatrix} a_1 \\ \vdots \\ a_n \end{bmatrix},\ \ \begin{bmatrix} \mathbf{d}_1^T \\ \mathbf{c}_2^T \\ \vdots \\ \mathbf{c}_n^T \end{bmatrix}\mathbf{x}= \begin{bmatrix} b_1 \\ a_2 \\ \vdots \\ a_n \end{bmatrix},\ \ \dots\ \ \begin{bmatrix} \mathbf{d}_1^T \\ \vdots \\ \mathbf{d}_n^T \end{bmatrix}\mathbf{x}= \begin{bmatrix} b_1 \\ \vdots \\ b_n \end{bmatrix} $$ and take the solution with minimal residual. However, there will be $2^n$ such systems to solve, so if $n$ is large this may not be practical.
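For concreteness, the exhaustive approach can be sketched as follows (a hypothetical implementation, only viable for small $n$; `brute_force_min` and the data shapes are my own naming, not part of the question):

```python
import itertools

import numpy as np


def brute_force_min(C, a, D, b):
    """Enumerate all 2^n combinations of rows (c_i, a_i) vs (d_i, b_i),
    solve each ordinary least-squares system, and keep the combination
    whose solution gives the smallest value of the true min-objective."""
    n = C.shape[0]
    best_x, best_val = None, np.inf
    for mask in itertools.product([0, 1], repeat=n):
        choice = np.array(mask)
        # Build the stacked system for this combination of rows
        A = np.where(choice[:, None] == 0, C, D)
        rhs = np.where(choice == 0, a, b)
        x, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        # Evaluate the actual objective: per-term minimum of the two squares
        val = np.sum(np.minimum((C @ x - a) ** 2, (D @ x - b) ** 2))
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val
```

Each candidate is scored against the original min-objective (not just its own stacked residual), since the least-squares solution for one row combination can still be evaluated under the true piecewise objective.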
I'm actually interested in solving a particular version of this problem where $\mathbf{x}\in\mathbb{R}^3$ and the following conditions hold: $$ \forall i, \|\mathbf{c}_i\|=\|\mathbf{d}_i\|=1 \land \mathbf{c}_i = \begin{bmatrix} -1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{bmatrix}\mathbf{d}_i \land a_i = b_i>0 $$ If I plug my problem into a nonlinear least squares optimiser (lsqnonlin in Matlab), it quickly converges to (one of) the global optima (for my particular problem there are two global optima) regardless of initialisation. I've also noticed that the two global optima are related by: $$ \mathbf{x}_1^* = \begin{bmatrix} -1 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1 \end{bmatrix}\mathbf{x}_2^* $$ I would like to gain some more insight into the problem. What is the best way to go about solving this problem? Although the problem is seemingly nonconvex, why does lsqnonlin always converge to one of the two global optima?
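A Python analogue of the lsqnonlin experiment, in case it helps others reproduce the behaviour: each component of the residual vector is whichever of the two linear residuals has smaller magnitude, so its square is the per-term minimum. The data below is synthetic (the positivity $a_i>0$ is not enforced in this toy generator, and the warm start is my choice for the demo), so treat it as a sketch of the setup rather than the actual problem instance:

```python
import numpy as np
from scipy.optimize import least_squares


def residuals(x, C, a, D, b):
    """Per-term residual: the one of the two linear residuals with
    smaller magnitude, so its square equals the min in the objective."""
    r1 = C @ x - a
    r2 = D @ x - b
    return np.where(np.abs(r1) <= np.abs(r2), r1, r2)


# S is the reflection diag(-1,-1,1) from the question; it is its own inverse,
# so c_i = S d_i is equivalent to d_i = S c_i.
S = np.diag([-1.0, -1.0, 1.0])
rng = np.random.default_rng(0)
C = rng.normal(size=(20, 3))
C /= np.linalg.norm(C, axis=1, keepdims=True)  # unit-norm rows, as assumed
D = C @ S                                      # row i of D is (S c_i)^T
x_true = np.array([0.3, -0.7, 1.1])
a = C @ x_true + 0.01 * rng.normal(size=20)    # a_i = b_i, small noise

# Warm start near x_true for a reliable demo; lsqnonlin reportedly
# converges from arbitrary initialisations on the real problem.
sol = least_squares(residuals, x0=x_true + 0.1, args=(C, a, D, a))
```

The residual vector is nonsmooth where the two magnitudes tie, but Gauss–Newton-type solvers such as `least_squares` (or lsqnonlin) typically cope with this in practice.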
This problem is NP-hard in general. Note that you have $n$ binary choices to make (which of the two terms attains the minimum for each $i$), and these choices are coupled: they must be made jointly, through the common variable $\mathbf{x}$.
Therefore, there are $2^n$ possible least-squares problems, and in the worst case you must solve all of them and keep the one that gives the global minimum.
However, if you had a $\max$ operator inside your summation, then you would have a convex optimization problem with quadratic constraints, which could also be formulated as a semidefinite program (SDP) — again convex, and solvable efficiently. I see that you have a different objective, though.
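To illustrate why the $\max$ variant is tractable: each term $\max\left[(\mathbf{c}_i^T\mathbf{x}-a_i)^2, (\mathbf{d}_i^T\mathbf{x}-b_i)^2\right]$ is a maximum of convex quadratics, so introducing an epigraph variable $t_i$ per term gives the convex program $\min \sum_i t_i$ s.t. $(\mathbf{c}_i^T\mathbf{x}-a_i)^2 \le t_i$, $(\mathbf{d}_i^T\mathbf{x}-b_i)^2 \le t_i$. A hedged sketch using a general-purpose constrained solver (a dedicated QCQP/SDP solver would be the more natural choice; `solve_max_variant` is my own naming):

```python
import numpy as np
from scipy.optimize import minimize


def solve_max_variant(C, a, D, b):
    """Solve min_x sum_i max[(c_i.x - a_i)^2, (d_i.x - b_i)^2] via the
    epigraph reformulation: min sum_i t_i s.t. both squares <= t_i."""
    n, m = C.shape
    # Decision vector z = (x_1..x_m, t_1..t_n)
    cons = []
    for i in range(n):
        # SLSQP expects inequality constraints of the form g(z) >= 0
        cons.append({'type': 'ineq',
                     'fun': lambda z, i=i: z[m + i] - (C[i] @ z[:m] - a[i]) ** 2})
        cons.append({'type': 'ineq',
                     'fun': lambda z, i=i: z[m + i] - (D[i] @ z[:m] - b[i]) ** 2})
    z0 = np.concatenate([np.zeros(m), np.ones(n)])
    res = minimize(lambda z: np.sum(z[m:]), z0, constraints=cons, method='SLSQP')
    x = res.x[:m]
    return x, np.sum(np.maximum((C @ x - a) ** 2, (D @ x - b) ** 2))
```

Because the epigraph problem is convex, any local solver that converges at all converges to the global optimum — in contrast to the $\min$ version, where the per-term choice makes the objective piecewise and nonconvex.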