Minimize the sum of two convex functions s.t. linear constraint, when we know the solution to each


Suppose $x$ is an $n$-dimensional real vector, and $f(x)$ and $g(x)$ are real-valued convex functions on $\mathbb{R}^n$.

I want to find the $x$ that minimizes $Q = f(x)+g(x)$ s.t. $a'x=b$, where $a$ is an $n \times 1$ real vector and $b$ is a real constant. Call this solution $X^*$. How is $X^*$ related to the two partial solutions $X_1$ and $X_2$ that solve the following?

  • $X_1$ is the solution to $\min Q_1 = f(x)$ s.t. $a’x=b$.

  • $X_2$ is the solution to $\min Q_2 = g(x)$ s.t. $a’x=b$.

If there is no direct/simple connection, can I use $X_1$ and $X_2$ in an efficient manner in an iterative optimization scheme, such as ADMM or the augmented Lagrangian method, to arrive at $X^*$?
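To make the ADMM idea concrete, here is a minimal sketch of the splitting I have in mind, on a toy quadratic pair (all data here is hypothetical, chosen so both subproblems have closed forms): $f(x)=\|x-c_1\|^2$, $g(x)=\|x-c_2\|^2$, split as $\min f(x)+g(z)$ s.t. $x=z$, with the hyperplane constraint $a'x=b$ kept inside the $x$-subproblem.

```python
import numpy as np

# Hypothetical toy data: f(x) = ||x - c1||^2, g(x) = ||x - c2||^2, a'x = b.
rng = np.random.default_rng(0)
n = 5
a = rng.standard_normal(n)
b = 1.0
c1 = rng.standard_normal(n)
c2 = rng.standard_normal(n)

def project_hyperplane(y, a, b):
    """Euclidean projection of y onto {x : a'x = b}."""
    return y - a * (a @ y - b) / (a @ a)

# ADMM on  min f(x) + g(z)  s.t.  x = z,  a'x = b  (scaled dual u).
rho = 1.0
x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
for _ in range(200):
    # x-update: argmin ||x - c1||^2 + (rho/2)||x - z + u||^2  s.t. a'x = b.
    # The Hessian is a multiple of the identity, so the constrained
    # minimizer is the projection of the unconstrained one.
    y = (2 * c1 + rho * (z - u)) / (2 + rho)
    x = project_hyperplane(y, a, b)
    # z-update: argmin ||z - c2||^2 + (rho/2)||x - z + u||^2 (unconstrained).
    z = (2 * c2 + rho * (x + u)) / (2 + rho)
    # dual ascent on the consensus constraint x = z.
    u = u + x - z

# For this particular quadratic pair, f + g = 2||x - (c1+c2)/2||^2 + const,
# so X* is the projection of the midpoint of c1 and c2 onto the hyperplane.
x_star = project_hyperplane((c1 + c2) / 2, a, b)
print(float(np.linalg.norm(x - x_star)))
```

Note that in this special quadratic case $X^*$ happens to be the projection of $(c_1+c_2)/2$, i.e. the midpoint of $X_1$ and $X_2$; I don't expect such a simple relation to hold for general convex $f$ and $g$, which is exactly what I'm asking about.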