Does approximate local optimality imply approximate global optimality in convex optimization?


Consider the constrained convex optimization problem $$ \min_{x\in \mathcal C} \; f(x) $$ where $\mathcal C$ is a closed convex set and $f$ is convex. Suppose I know that for some $\bar x \in \mathcal C$, $$ \nabla f(\bar x)^T (\bar x - x) \leq \epsilon \quad \forall x \in \mathcal C \text{ with } \|x-\bar x\|_2 \leq \delta $$ for some small $\epsilon > 0$ and $\delta > 0$. Can I extend this condition globally, i.e. does it imply that $$ \nabla f(\bar x)^T (\bar x - x) \leq \epsilon \quad \forall x \in \mathcal C? $$ Note that this is true when $\epsilon = 0$, since then it is equivalent to saying that local optimality implies global optimality. But does it hold for approximate optimality?



Best answer:

By the Cauchy-Schwarz inequality, $$\nabla f(\bar{x})^T(\bar{x}-x)\leq |\nabla f(\bar{x})^T(\bar{x}-x)|\leq \|\nabla f(\bar{x})\|_2\,\|\bar{x}-x\|_2,$$ so given any $\bar{x}\in\mathcal{C}$ and $\varepsilon>0$, it is always possible to choose $\delta>0$ (depending on $\bar{x}$ and $\varepsilon$) so that $x\in \mathcal{C}$ and $\|\bar{x}-x\|_2\leq \delta$ together imply $\nabla f(\bar{x})^T(\bar{x}-x)\leq\varepsilon$. In other words, once $\delta$ is small enough the local condition holds at every point of $\mathcal{C}$, not just near a minimizer, so it carries no information about $\bar{x}$ whatsoever. Consequently, you cannot in general expect the inequality $\nabla f(\bar{x})^T(\bar{x}-x)\leq\varepsilon$ to hold for all $x\in\mathcal{C}$.
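A concrete instance of this failure can be checked numerically. The function, point, and tolerances below are my own choices, not from the answer: take $f(x) = (x-2)^2/2$ on $\mathcal C = [0,1]$ (convex, with $\nabla f(x) = x - 2$) and $\bar x = 0$, so that $\nabla f(\bar x)(\bar x - x) = 2x$. Picking $\delta = \varepsilon / \|\nabla f(\bar x)\|$ makes the local bound hold, yet the supremum over all of $\mathcal C$ is $2$, far above $\varepsilon$:

```python
import numpy as np

# Assumed counterexample: f(x) = (x - 2)^2 / 2 on C = [0, 1].
# f is convex and grad f(x) = x - 2.
x_bar = 0.0
grad = x_bar - 2.0            # grad f(x_bar) = -2
eps = 0.1
delta = eps / abs(grad)       # Cauchy-Schwarz choice: |grad| * delta <= eps

xs = np.linspace(0.0, 1.0, 10001)   # dense sample of C = [0, 1]
vals = grad * (x_bar - xs)          # grad f(x_bar)^T (x_bar - x) = 2x

# Supremum over the delta-ball vs. over all of C.
local_sup = vals[np.abs(xs - x_bar) <= delta].max()
global_sup = vals.max()

print(local_sup)    # stays at or below eps = 0.1
print(global_sup)   # equals 2, far above eps
```

So the local bound is satisfied at a point that is nowhere near optimal, which is exactly why it cannot be extended to all of $\mathcal C$.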