Let me start by saying I know almost nothing about optimization so please bear with me. Basically, I am wondering whether it is possible to solve a problem with two constraints by solving the problem with each constraint individually and somehow combining the solutions. However, I'm having trouble visualizing the scenario. More specifically:
Suppose $V$ is some Banach space and $f:V \to \mathbb{R}$ is convex and continuous. Furthermore, let $K_1, K_2 \subset V$ be convex and consider the solutions $x_i$ of the optimization problems $$\min_{x \in K_i} f(x).$$
For $K = K_1 \cap K_2$ let $x$ solve $$\min_{x \in K} f(x).$$
- Is $x$ the projection of $x_1$ onto $K_2$?
- Is $x$ a linear combination of $x_1$ and $x_2$?
- Are there any relations between $x$, $x_1$ and $x_2$ at all?
- Does the situation change if we further restrict $f$? For example to be quadratic?
Thank you very much in advance.
For general convex sets, the sorts of simple relationships you're suggesting don't exist. I find it helps to draw two-dimensional pictures to see why. (Even if the space has dimension higher than 2, we could restrict the problems to the plane spanned by $x_1, x_2, x$, and none of the answers to your questions would change.) Here's an example of what can happen even in the simple case where $f$ is linear:
The red arrow is the direction of the negative gradient of $f$. Minimizing $f$ means moving as far as possible in that direction. To address your specific questions: in general the answer to the first three is no. The solution $x$ need not be the projection of $x_1$ onto $K_2$, and it need not be a linear (or even affine) combination of $x_1$ and $x_2$. Restricting $f$ further, say to be quadratic, does not change this; the counterexamples are driven by the geometry of the sets, not by the objective.
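To make this concrete, here is a small numeric sketch with choices of my own (not from the picture above): take $f(x, y) = y$ and let $K_1, K_2$ be closed disks of radius $1.5$ centered at $(-1, 0)$ and $(1, 0)$. The minimizer over each disk is its lowest point, while the minimizer over the lens-shaped intersection is a circle intersection point, which is neither the projection of $x_1$ onto $K_2$ nor an affine combination of $x_1$ and $x_2$:

```python
import numpy as np

# Illustrative data: f(x, y) = y, and K1, K2 are disks of radius 1.5
# centered at (-1, 0) and (1, 0). These choices are mine, for the sketch.
c1, c2, r = np.array([-1.0, 0.0]), np.array([1.0, 0.0]), 1.5

# Minimizing y over a disk picks out the lowest point of that disk.
x1 = c1 + np.array([0.0, -r])          # (-1, -1.5), solves min over K1
x2 = c2 + np.array([0.0, -r])          # ( 1, -1.5), solves min over K2

# The circles intersect on the axis x = 0, at y = ±sqrt(r^2 - 1);
# the lowest point of the lens K1 ∩ K2 is the lower intersection point.
x_star = np.array([0.0, -np.sqrt(r**2 - 1.0)])   # ≈ (0, -1.118)

def proj_disk(p, center, radius):
    """Euclidean projection of p onto the disk of given center and radius."""
    d = p - center
    n = np.linalg.norm(d)
    return p if n <= radius else center + radius * d / n

p = proj_disk(x1, c2, r)               # projection of x1 onto K2: (-0.2, -0.9)
print(x_star, p)                       # x_star differs from that projection
# Every affine combination a*x1 + (1-a)*x2 has second coordinate -1.5,
# so no such combination equals x_star either.
```

So already for a linear objective and two round, symmetric sets, none of the proposed relationships hold.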
That said, there are methods, like ADMM (the alternating direction method of multipliers), that can use solvers for the subproblems, typically through proximal operators or projections, to converge to a solution of the larger problem.
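For instance, here is a minimal consensus-ADMM sketch (an illustration under my own example data from above, not a production solver): the problem $\min_{x \in K_1 \cap K_2} c^\top x$ is split into three terms, the linear objective plus the indicator functions of $K_1$ and $K_2$, and each term is touched only through its proximal operator, which is a shift for the linear term and a projection for each set:

```python
import numpy as np

# Consensus ADMM for: minimize c^T x over K1 ∩ K2, with
#   g1 = linear objective, g2 = indicator of K1, g3 = indicator of K2.
# Example data (my own choices): f(x, y) = y, disks of radius 1.5
# centered at (-1, 0) and (1, 0).
c = np.array([0.0, 1.0])
c1, c2, r = np.array([-1.0, 0.0]), np.array([1.0, 0.0]), 1.5

def proj_disk(p, center, radius):
    d = p - center
    n = np.linalg.norm(d)
    return p if n <= radius else center + radius * d / n

rho = 1.0                                  # ADMM penalty parameter
x = np.zeros((3, 2))                       # local copies, one per term
u = np.zeros((3, 2))                       # scaled dual variables
z = np.zeros(2)                            # consensus variable
for _ in range(5000):
    x[0] = (z - u[0]) - c / rho            # prox of the linear term
    x[1] = proj_disk(z - u[1], c1, r)      # prox of indicator of K1
    x[2] = proj_disk(z - u[2], c2, r)      # prox of indicator of K2
    z = (x + u).mean(axis=0)               # consensus (averaging) step
    u += x - z                             # dual update

print(z)   # converges to the lowest point of the lens, ≈ (0, -1.118)
```

Note that the method never solves the two single-constraint problems and combines their answers; it only ever calls the projection onto each $K_i$ separately, which is the sense in which "per-constraint solvers" can be reused.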