How can one solve an optimization problem of the form \begin{align} \min_{x \in \mathbb{R}^n} \ f(x) + \sum_{i=1}^m g_i(x), \end{align} where $f(x)$ and the $g_i(x)$ are closed, proper, convex functions? We can also assume that $f(x)$ and the $g_i(x)$ are "proximable" functions [1][2].
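To make "proximable" concrete, here is a minimal sketch (not part of the question's actual data) using the $\ell_1$ norm, whose proximal operator has the well-known closed form of soft-thresholding:

```python
# Illustrative example: g(x) = lam * ||x||_1 is a canonical "proximable"
# function. Its proximal operator, prox_{t*g}(v) = argmin_x { t*lam*||x||_1
# + 0.5*||x - v||^2 }, has the closed form sign(v) * max(|v| - t*lam, 0).
import numpy as np

def prox_l1(v, t, lam=1.0):
    """Soft-thresholding: proximal operator of t * lam * ||.||_1 at v."""
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

v = np.array([3.0, -0.5, 1.2])
print(prox_l1(v, t=1.0))  # each entry shrunk toward 0 by 1, small ones zeroed
```

Splitting methods exploit exactly this: each $g_i$ is touched only through such a cheap prox evaluation, never through its gradient or Hessian.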
Please note that this is a large-scale optimization problem. What algorithms would you suggest for solving it (besides consensus ADMM)?
Thank you so much in advance.
Your problem is to solve $$\min_x \left\{ \sum_{i=1}^m g_i(x) : \|Ax-b\|_2^2 \leq 1 \right\}.$$ Since all the $g_i$ are quadratic forms, their sum is also a quadratic form. You are therefore optimizing a single quadratic form subject to a single quadratic constraint.
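A quick numerical sanity check of that collapse (with random stand-in matrices, not your data): summing quadratic forms $x^\top P_i x + q_i^\top x$ is the same as evaluating one quadratic form built from $\sum_i P_i$ and $\sum_i q_i$:

```python
# Verify: sum_i (x^T P_i x + q_i^T x) == x^T (sum_i P_i) x + (sum_i q_i)^T x,
# so the m-term objective collapses into a single quadratic form.
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 3
Ps = [rng.standard_normal((n, n)) for _ in range(m)]   # stand-in P_i matrices
qs = [rng.standard_normal(n) for _ in range(m)]        # stand-in linear terms
x = rng.standard_normal(n)

lhs = sum(x @ P @ x + q @ x for P, q in zip(Ps, qs))   # term-by-term sum
rhs = x @ sum(Ps) @ x + sum(qs) @ x                    # single collapsed form
print(np.isclose(lhs, rhs))  # True
```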
The scale of your problem ($n = 8096$) is actually modest: the dense Hessian takes only about 500 MB of memory. Newton-based methods (such as sequential quadratic programming or interior-point methods) will have no trouble with this formulation, and will probably be much faster than ADMM. Give CPLEX or Gurobi a try.
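As a hedged sketch of what such a Newton-type solve looks like (random stand-in data `P`, `A`, `b` at a small size, not your actual problem or a CPLEX/Gurobi model), SciPy's `trust-constr` method handles one quadratic objective under one quadratic constraint directly:

```python
# Sketch: minimize 0.5 * x^T P x subject to ||A x - b||^2 <= 1 with a
# Newton-type method (SciPy's 'trust-constr'). All data here are random
# placeholders at a small demo size; commercial solvers like Gurobi or
# CPLEX solve the same QCQP form at much larger scale.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

rng = np.random.default_rng(0)
n, m_rows = 40, 25                      # small demo sizes
M = rng.standard_normal((n, n))
P = M.T @ M + np.eye(n)                 # positive definite objective Hessian
A = rng.standard_normal((m_rows, n))
b = rng.standard_normal(m_rows)

obj = lambda x: 0.5 * x @ P @ x
grad = lambda x: P @ x
con = NonlinearConstraint(lambda x: np.sum((A @ x - b) ** 2), -np.inf, 1.0)

x0 = np.linalg.lstsq(A, b, rcond=None)[0]   # feasible start: A x0 = b exactly
res = minimize(obj, x0, jac=grad, method="trust-constr", constraints=[con])
print(res.fun <= obj(x0), np.sum((A @ res.x - b) ** 2) <= 1.0 + 1e-6)
```

The same formulation, handed to an interior-point QCQP solver with sparse data structures, is what makes the $n = 8096$ case routine.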