Given a real vector $X=[x_1,x_2,x_3]$, I need a cost function that minimizes the largest element of $X$, i.e.
$$\min \max(x_1, x_2, x_3).$$
The closest thing I can think of is using the squared L2 norm,
$$\min \left(x_1^2 + x_2^2 + x_3^2\right),$$
but this does not distinguish between, say, all elements of $X$ sharing the same moderate value and $x_1 = x_2 = 0$ with $x_3$ really large: the two cases can have the same norm while their maxima are very different.
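A small numerical sketch of that objection (the particular vectors are my own illustrative choice): two vectors with identical L2 norm but very different maxima, so minimizing the norm cannot stand in for minimizing the max.

```python
import numpy as np

# Two vectors with the same L2 norm but very different maxima,
# illustrating why min ||X||_2 is not a proxy for min max(X).
a = np.array([1.0, 1.0, 1.0])
b = np.array([0.0, 0.0, np.sqrt(3.0)])

print(np.linalg.norm(a), np.linalg.norm(b))  # both sqrt(3) ~ 1.732
print(a.max(), b.max())                      # 1.0 vs ~ 1.732
```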
The cost function should be differentiable so that I can compute the gradient analytically (I am able to do this for each component of $X$ individually).
In a different problem I also want to be able to solve
$$\max \min(x_1, x_2, x_3).$$
Any ideas or hints? Thanks!
You can make $\max(x_1,x_2,x_3)$ the cost function. In order to make it differentiable, define a new variable $y$ such that $y = \max(x_1,x_2,x_3)$, and add suitable constraints, which happen to be linear (and differentiable), in order to enforce the desired relation.
Specifically,
$$\min y \quad \text{subject to} \quad y \ge x_1,\; y \ge x_2,\; y \ge x_3.$$
Because $y$ is being minimized, the optimization will force $y$ to be equal to $\max(x_1,x_2,x_3)$ at the optimum. You can add additional terms to the objective function, and additional constraints if you wish.
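A minimal sketch of this epigraph reformulation using `scipy.optimize.linprog`. The constraint $x_1+x_2+x_3=3$ with $x_i \ge 0$ is a made-up example constraint added only so the problem is bounded; the decision vector is $[x_1, x_2, x_3, y]$.

```python
import numpy as np
from scipy.optimize import linprog

# min y  subject to  y >= x_i  (written as x_i - y <= 0),
# plus an illustrative constraint x1 + x2 + x3 = 3, x_i >= 0.
c = np.array([0.0, 0.0, 0.0, 1.0])          # objective: minimize y
A_ub = np.array([[1, 0, 0, -1],             # x1 - y <= 0
                 [0, 1, 0, -1],             # x2 - y <= 0
                 [0, 0, 1, -1]])            # x3 - y <= 0
b_ub = np.zeros(3)
A_eq = np.array([[1.0, 1.0, 1.0, 0.0]])     # x1 + x2 + x3 = 3
b_eq = np.array([3.0])
bounds = [(0, None)] * 3 + [(None, None)]   # x_i >= 0, y unrestricted

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x)  # the optimum spreads the mass evenly: x = [1, 1, 1], y = 1
```

At the optimum $y$ equals the largest $x_i$, as the answer predicts; here minimizing the max pushes all components to the common value $1$.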
The max-min problem can be handled in an analogous manner: introduce $t$, maximize $t$, and reverse the direction of the inequality constraints so that $t \le x_1$, $t \le x_2$, $t \le x_3$.
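The analogous max-min sketch, again with the made-up constraint $x_1+x_2+x_3=3$, $x_i \ge 0$ so the problem is well posed. Maximizing $t$ is expressed as minimizing $-t$, and the reversed constraints $t \le x_i$ become $t - x_i \le 0$.

```python
import numpy as np
from scipy.optimize import linprog

# max t  subject to  t <= x_i  (written as t - x_i <= 0),
# plus the illustrative constraint x1 + x2 + x3 = 3, x_i >= 0.
c = np.array([0.0, 0.0, 0.0, -1.0])         # minimize -t  <=>  maximize t
A_ub = np.array([[-1, 0, 0, 1],             # t - x1 <= 0
                 [0, -1, 0, 1],             # t - x2 <= 0
                 [0, 0, -1, 1]])            # t - x3 <= 0
b_ub = np.zeros(3)
A_eq = np.array([[1.0, 1.0, 1.0, 0.0]])     # x1 + x2 + x3 = 3
b_eq = np.array([3.0])
bounds = [(0, None)] * 3 + [(None, None)]   # x_i >= 0, t unrestricted

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(-res.fun)  # the maximized minimum: t = 1, again at x = [1, 1, 1]
```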