How to solve $\min_{x \in \mathbb{R}^n} \ f(x) + \sum_{i=1}^m g_i(x)$ besides consensus ADMM?


How can one solve an optimization problem of the form \begin{align} \min_{x \in \mathbb{R}^n} \ f(x) + \sum_{i=1}^m g_i(x), \end{align} where $f(x)$ and the $g_i(x)$ are closed, proper, convex functions? We can also assume that $f(x)$ and the $g_i(x)$ are "proximable" functions [1][2].


Please note that this is a large-scale optimization problem. What algorithms would you suggest for solving it (besides consensus ADMM)?
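For concreteness, "proximable" means the proximal map $\operatorname{prox}_{tg}(v) = \arg\min_x g(x) + \frac{1}{2t}\|x - v\|_2^2$ has a cheap closed form. A standard example (my illustration, not from the question) is $g = \|\cdot\|_1$, whose prox is elementwise soft thresholding:

```python
import numpy as np

# prox_{t g}(v) = argmin_x g(x) + (1/(2t)) ||x - v||^2.
# For g(x) = ||x||_1 this is elementwise soft thresholding:
# entries with |v_i| <= t are zeroed, the rest shrink toward 0 by t.
def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

print(prox_l1(np.array([2.0, -0.5, 0.1]), 1.0))
```

Splitting methods (proximal gradient, Douglas–Rachford, consensus ADMM and its alternatives) only touch each $g_i$ through such a map.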

Thank you so much in advance.


There are 2 best solutions below

Answer 1:

Your problem is to optimize $$\min_x \left\{ \sum_{i=1}^m g_i(x) : \|Ax-b\|_2^2 \leq 1 \right\}.$$ Since all the $g_i$ are quadratic forms, their sum is also a quadratic form, so you are optimizing a single quadratic form subject to a single quadratic constraint.

The scale of your problem ($n = 8096$) is small: the Hessian takes only about 512 MB of memory. Newton-based methods (such as sequential quadratic programming or interior-point methods) will have no problem with this formulation, and are probably much faster than ADMM. Give CPLEX or Gurobi a try.
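A minimal sketch of this route, using SciPy's `trust-constr` solver rather than CPLEX/Gurobi (my substitution for illustration), on a tiny random stand-in for the $n = 8096$ problem:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Tiny stand-in for the full-size problem (the question has n = 8096):
# minimize x^T Q x  subject to  ||Ax - b||_2^2 <= 1,
# where Q (the Hessian of sum_i g_i) and A, b are random placeholders.
rng = np.random.default_rng(0)
n, m = 6, 4
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)            # positive definite -> convex objective
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

obj = lambda x: x @ Q @ x
grad = lambda x: 2.0 * Q @ x

# One quadratic constraint, with its Jacobian, for the trust-constr solver.
con = NonlinearConstraint(
    lambda x: np.sum((A @ x - b) ** 2), -np.inf, 1.0,
    jac=lambda x: 2.0 * (A @ x - b) @ A,
)

x0 = np.linalg.lstsq(A, b, rcond=None)[0]   # feasible start: A x0 = b (m < n)
res = minimize(obj, x0, jac=grad, method="trust-constr", constraints=[con])
print(res.x, obj(res.x))
```

At full scale one would additionally pass a Hessian (or Hessian-vector products) and exploit sparsity, which dedicated QCQP solvers do automatically.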

Answer 2:

Look at the quadratic form of the Hybrid Steepest Descent method (see Quadratic Optimization of Fixed Points of Nonexpansive Mappings in Hilbert Space).

Some of the methods there have low memory requirements.

I used them in my answer to Orthogonal Projection onto the Intersection of Convex Sets as an alternative to the ADMM consensus trick.
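A sketch of that quadratic case, under my own assumptions (the sets, the point $y$, and the step sizes are illustrative, not from the answer): minimizing $f(x) = \tfrac12\|x - y\|^2$ over the fixed points of $T$, where $T$ is the composition of two projections, so the iteration computes the projection of $y$ onto the intersection of the two sets. With these steps it reduces to a Halpern-type iteration:

```python
import numpy as np

# Hybrid steepest descent, quadratic case: x_{k+1} = T(x_k) - lam_k * grad f(T(x_k))
# with f(x) = 0.5 * ||x - y||^2, T = composition of two projections, and
# diminishing steps lam_k -> 0 with sum lam_k = infinity.

def proj_ball(x, c, r):                  # projection onto ||x - c|| <= r
    d = x - c
    nd = np.linalg.norm(d)
    return x if nd <= r else c + r * d / nd

def proj_halfspace(x, a, beta):          # projection onto a @ x <= beta
    v = a @ x - beta
    return x if v <= 0 else x - v * a / (a @ a)

y = np.array([3.0, 3.0])                 # point to project
c, r = np.zeros(2), 1.0                  # unit ball
a, beta = np.array([1.0, -1.0]), 0.0     # halfspace x1 <= x2

x = y.copy()
for k in range(1, 2000):
    t = proj_ball(proj_halfspace(x, a, beta), c, r)   # T(x_k)
    lam = 1.0 / k
    x = t - lam * (t - y)                # grad f(t) = t - y
print(x)                                 # close to (1,1)/sqrt(2)
```

Memory use is just a few vectors of length $n$, which is the appeal over Hessian-based methods at large scale.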