Global optimum in a convex space/set


Recall that a set $S⊂\mathbb R^n$ is said to be convex if for any $x,y∈S$, and any $λ∈[0, 1]$, we have $λx+(1−λ)y∈S$.
Let $f:\mathbb R^n→\mathbb R$ be a convex function and let $S⊂\mathbb R^n$ be a convex set. Let $x^∗$ be an element of $S$. Suppose that $x^∗$ is a local optimum for the problem of minimizing $f(x)$ over $S$, that is, there exists some $ε > 0$ such that $f(x^∗) \leqslant f(x)$ for all $x ∈ S$ for which $\|x−x^∗\| \leqslant ε$. Prove that $x^∗$ is globally optimal, that is, $f(x^∗) \leqslant f(x)$ for all $x ∈ S$.

I have been looking at this problem for a while now and I feel like I can kind of picture it in my head, but am having a hard time putting it into words. Can anyone help me out?



Suppose $x^*$ is not a global minimizer; then there is some $x \in S$ with $f(x) < f(x^*)$. By convexity, every point $y \neq x^*$ on the line segment from $x^*$ to $x$ satisfies $f(y) < f(x^*)$ as well. Such points get arbitrarily close to $x^*$, so $x^*$ is not a local minimizer either.

To prove the inequality $f(y) < f(x^*)$, apply convexity of $f$: writing $y = \lambda x + (1-\lambda)x^*$ with $\lambda \in (0,1]$,
$$f(y) \leq \lambda f(x) + (1-\lambda)f(x^*) < \lambda f(x^*) + (1-\lambda)f(x^*) = f(x^*).$$


Here is a more detailed proof by contradiction:

Assume that $x^{\star}$ is not a global minimizer:

  • Then there is some $\xi \in S$ with $f(\xi) < f(x^{\star})$.
  • By convexity of $S$, $x(t) = t\xi + (1-t)x^{\star} \in S$ for all $t \in [0,1]$.
  • $\|x^{\star} - x(t)\| = \|x^{\star} - (t\xi + (1-t)x^{\star})\| = \|t(x^{\star} - \xi)\| = t\|x^{\star} - \xi\|$
  • $\Rightarrow \|x^{\star} - x(t)\| = t\|x^{\star} - \xi\| < \varepsilon$ whenever $t < \frac{\varepsilon}{\|x^{\star} - \xi\|}$

Now, for any $t_0$ with $0 < t_0 < \frac{\varepsilon}{\|x^{\star} - \xi\|}$ (and $t_0 \leq 1$), the point $x(t_0)$ lies in $S$ with $\|x^{\star} - x(t_0)\| \leq \varepsilon$, so local optimality and convexity of $f$ give

$$f(x^{\star}) \leq f(x(t_0)) = f(t_0\xi + (1-t_0)x^{\star}) \leq t_0 f(\xi) + (1-t_0)f(x^{\star}) < t_0 f(x^{\star}) + (1-t_0)f(x^{\star}) = f(x^{\star}),$$

where the strict inequality uses $f(\xi) < f(x^{\star})$ and $t_0 > 0$. The conclusion $f(x^{\star}) < f(x^{\star})$ is a contradiction!
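As a quick numerical sanity check of the mechanism in the proof (an illustration, not a proof itself), one can verify that for a convex function, every point on the segment from $x^{\star}$ toward a lower-valued point $\xi$ has a strictly smaller value, so no neighborhood of $x^{\star}$ can witness local optimality. The choices $f(x) = x^2$, $x^{\star} = 2$, $\xi = 0$ below are arbitrary, made up for the example:

```python
# Illustration of the key inequality: if f is convex and f(xi) < f(x_star),
# then f(x(t)) < f(x_star) for every t in (0, 1], where
# x(t) = t*xi + (1-t)*x_star is a point on the segment from x_star to xi.
# f, x_star, xi are arbitrary choices for this sketch, not from the problem.

def f(x):
    return x ** 2  # a convex function

x_star, xi = 2.0, 0.0  # f(xi) = 0 < f(x_star) = 4

for k in range(1, 101):
    t = k / 100                       # t in (0, 1]
    y = t * xi + (1 - t) * x_star     # point on the segment
    # convexity: f(y) <= t*f(xi) + (1-t)*f(x_star), and the RHS < f(x_star)
    assert f(y) <= t * f(xi) + (1 - t) * f(x_star) + 1e-12
    assert f(y) < f(x_star)
```

Since the assertions hold for $t$ arbitrarily close to $0$, points with value below $f(x^{\star})$ appear in every $\varepsilon$-ball around $x^{\star}$, which is exactly the contradiction the proof exploits.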