Minimizing difference and individual variables in convex problem


Let's say I have the following optimization problem:

$$ \begin{align*} \min_{\mathbf{x},\mathbf{y}} & \sum_i \left(x_i-y_i\right) \\ \mathrm{s.t.} & \; (\mathbf{x},\mathbf{y}) \in C, \end{align*} $$ where $\mathbf{x}$ and $\mathbf{y}$ are vectors, $x_i$ and $y_i$ are the $i^{\mathrm{th}}$ elements of these vectors, and $C$ is a well-defined convex set in these variables.

I want to minimize the difference $\sum_i \left(x_i - y_i\right)$, but I also want $\mathbf{x}$ to be as small as possible element-wise. In other words, if there are many feasible pairs $(\mathbf{x},\mathbf{y})$ that minimize the difference, I'd like to obtain the one with the element-wise smallest $\mathbf{x}$.

I have thought about this, and it seems to me that this should hold automatically, but I'm not certain. Any insights? If not, how can I guarantee this property?

**Best Answer**

If you know that $\sigma = \min f(\mathbf{x})$ subject to $\mathbf{x}\in C$, and you want to minimize $g(\mathbf{x})$ over all such minimizers $\mathbf{x}$, I think all you have to do is solve

$$ \min_{\mathbf{x}} g(\mathbf{x}) \;\;\; \mathrm{s.t.} \;\;\; f(\mathbf{x})=\sigma \wedge \mathbf{x} \in C. $$

So, if you can solve the original problem and obtain the optimal value $\sigma$, you can then use it to minimize $g(\mathbf{x})$ over the original problem's solution set by solving the second optimization problem above. Note that in your case $f(\mathbf{x},\mathbf{y}) = \sum_i (x_i - y_i)$ is linear, so the added equality constraint $f = \sigma$ is affine and the second problem's feasible set remains convex; taking $g(\mathbf{x}) = \sum_i x_i$ then selects an element-wise small $\mathbf{x}$ among the first-stage optima.
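As a concrete illustration, here is a minimal sketch of the two-stage approach using SciPy's `linprog` on a hypothetical instance (the set $C$, bounds, and the specific constraints below are assumptions chosen so that the first stage has many optima):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical instance. Variables z = [x1, x2, y1, y2]; the convex set C
# is the box 0 <= x_i, y_i <= 5 together with x_i - y_i >= 1.

# Stage 1: minimize f(x, y) = sum_i (x_i - y_i) over C.
c1 = np.array([1.0, 1.0, -1.0, -1.0])      # coefficients of f
A_ub = np.array([[-1.0, 0.0, 1.0, 0.0],    # -(x1 - y1) <= -1
                 [0.0, -1.0, 0.0, 1.0]])   # -(x2 - y2) <= -1
b_ub = np.array([-1.0, -1.0])
bounds = [(0.0, 5.0)] * 4

res1 = linprog(c1, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
sigma = res1.fun                           # optimal value of f

# Stage 2: minimize g(x) = sum_i x_i subject to C and f(x, y) = sigma.
# (In floating point one would typically relax this to f <= sigma + eps.)
c2 = np.array([1.0, 1.0, 0.0, 0.0])        # coefficients of g
A_eq = np.array([[1.0, 1.0, -1.0, -1.0]])  # f(x, y) = sigma
b_eq = np.array([sigma])

res2 = linprog(c2, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x_opt, y_opt = res2.x[:2], res2.x[2:]      # element-wise smallest optimal x
```

In stage 1, any pair with $x_i - y_i = 1$ is optimal (e.g. $\mathbf{x}=(5,5)$, $\mathbf{y}=(4,4)$); stage 2 then picks out the element-wise smallest one, $\mathbf{x}=(1,1)$, $\mathbf{y}=(0,0)$.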

There is also a somewhat related post here.