Let $\mathbb{S}^m$ be an arbitrary compact, convex set in $\mathbb{R}^m$. Let $\mathbf{x}=(x_1,\dots,x_m)$ denote a point in $\mathbb{S}^m$. Then, I define the function \begin{align} f(\mathbf{x})=\max_{i}\{x_i\},~~\mathbf{x}\in \mathbb{S}^m \end{align} on the set $\mathbb{S}^m$. Consider the optimization problem \begin{align} \min_{\mathbf{x}\in \mathbb{S}^m}~f(\mathbf{x}) \end{align}
My question is
- Is the optimization problem above a convex optimization problem?
- In general, what can one say about its solution?
The function $f(\mathbf{x}) = \max \{x_1,\dots,x_m\}$ is convex, being the pointwise maximum of the linear functions $\mathbf{x} \mapsto x_i$. Since the feasible set $\mathbb{S}^m$ is also convex, the problem is a convex optimization problem.
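A standard way to make this explicit is the epigraph reformulation, which introduces an auxiliary scalar variable $t$:
\begin{align}
\min_{\mathbf{x}\in \mathbb{S}^m,\ t\in\mathbb{R}}~t \quad \text{subject to} \quad x_i \le t,~~ i=1,\dots,m.
\end{align}
The constraints $x_i \le t$ are linear, so if $\mathbb{S}^m$ is described by linear constraints, the whole problem is a linear program.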
Since $\mathbb{S}^m$ is assumed compact and $f$ is continuous, a minimizer exists by the Weierstrass extreme value theorem. Without compactness the problem may have no solution, for example if $\mathbb{S}^m = \mathbb{R}^m$ (unbounded) or if $\mathbb{S}^m$ is open. Beyond existence, little can be said without knowing more about $\mathbb{S}^m$.
Note: Let $I(x) = \{ i \mid x_i = \max_j x_j \}$ (the 'active' indices), and suppose $\hat{x}$ is a solution. Then in some neighborhood of $\hat{x}$ we have $I(x) \subseteq I(\hat{x})$, so on that neighborhood $\max_i x_i = \max_{i \in I(\hat{x})} x_i$, and locally the problem reduces to $\min_{x \in \mathbb{S}^m} \max_{i \in I(\hat{x})} x_i$.
If $|I(\hat{x})| = 1$, say with active index $i$, then locally the problem reduces to minimizing the single coordinate $x_i$, so the solution lies on the part of the boundary of $\mathbb{S}^m$ where $x_i$ is smallest. If $|I(\hat{x})| = m$, then the solution lies on the line $x_1 = \dots = x_m$.
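As a concrete illustration of the $|I(\hat{x})| = m$ case, here is a small sketch (assuming, as an example, that $\mathbb{S}^m$ is the probability simplex $\{x \ge 0,\ \sum_i x_i = 1\}$, which is compact and convex). Via the epigraph reformulation the problem becomes a linear program, and the minimizer is the uniform point on the diagonal:

```python
import numpy as np
from scipy.optimize import linprog

# Minimize max_i x_i over the probability simplex in R^m
# (example choice of set: {x >= 0, sum_i x_i = 1}).
# Epigraph form: minimize t subject to x_i <= t for all i.
m = 4

# Variables: (x_1, ..., x_m, t); the objective is t.
c = np.zeros(m + 1)
c[-1] = 1.0

# Inequality constraints: x_i - t <= 0 for each i.
A_ub = np.hstack([np.eye(m), -np.ones((m, 1))])
b_ub = np.zeros(m)

# Equality constraint: sum_i x_i = 1.
A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
b_eq = np.array([1.0])

# Bounds: x >= 0, t free.
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x_opt, t_opt = res.x[:m], res.x[-1]
```

Here the optimal value is $1/m$, attained at $x = (1/m,\dots,1/m)$: all $m$ indices are active, and the solution lies on the line $x_1 = \dots = x_m$, consistent with the note above.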
The issue with this approach is that for larger $m$ you hit combinatorial questions about which index sets can be active, which do not really surface in $\mathbb{R}^2$.