optimality condition proof


Let $f:\mathbb{R}^{n}\rightarrow\mathbb{R}$ be a twice continuously differentiable function over the convex set $X$. Show:

  1. If $x^*$ is a local minimizer, then

    $(x-x^{*})^{T}Hf(x^{*})(x-x^{*})\geq0$

    for all $x$ such that $\nabla f(x^{*})^{T}(x-x^{*})=0$.

  2. If $\nabla f(x^{*})^{T}(x-x^{*})\geq0$ for all $x\in{X}$, then $x^*$ is a local minimizer.

MY ATTEMPT:

  1. I think I have to use the second-order Taylor expansion:

    $f(x)=f(x^{*})+\nabla f(x^{*})^{T}(x-x^{*})+\dfrac{1}{2}(x-x^{*})^{T}Hf(x^{*})(x-x^{*})+o\left(\left\Vert x-x^{*}\right\Vert ^{2}\right)$

    Since $\nabla f(x^{*})^{T}(x-x^{*})=0$, we get $f(x)=f(x^{*})+\dfrac{1}{2}(x-x^{*})^{T}Hf(x^{*})(x-x^{*})+o\left(\left\Vert x-x^{*}\right\Vert ^{2}\right)$.

    And now?

ANSWER:

$\renewcommand{\Re}{\mathbb{R}}$Let us start with the second question. Assume that $f$ is continuously differentiable and $x^*\in X$ satisfies

$$\begin{aligned} \nabla f(x^*)^\top (x-x^*) \geq 0, \end{aligned}$$

for all $x\in X$. Now take $d\in\Re^n$ with $\|d\| = 1$ such that $x^*+\alpha d\in X$ for all $\alpha\in (0,\alpha')$, for some $\alpha'>0$. Then, using the first-order expansion of $f$ at $x^*$,

$$\begin{aligned} f(x^*+\alpha d) = f(x^*) + \nabla f(x^*)^\top\alpha d + o(\alpha) \end{aligned}$$

and since $\alpha \nabla f(x^*)^\top d \geq 0$ and $o(\alpha)$ goes to zero faster than $\alpha$ (by definition), we may find $\bar\alpha > 0$ such that for all $\alpha \in (0, \bar\alpha)$ we have

$$\begin{aligned} f(x^*+\alpha d) \geq f(x^*). \end{aligned}$$

This means that $x^*$ is a local minimizer of $f$ over $X$.
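As a quick sanity check of the sufficient condition (an example not in the original post): take $X=[1,2]\subset\Re$, $f(x)=x^2$, and $x^*=1$. Then

$$\begin{aligned} \nabla f(x^*)^\top (x-x^*) = 2(x-1) \geq 0 \quad \text{for all } x\in[1,2], \end{aligned}$$

and indeed $x^*=1$ minimizes $f$ over $X$.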

The first condition can be proven using the second-order expansion of $f$, which is:

$$\begin{aligned} f(x^* + \alpha d) = f(x^*) + \alpha \nabla f(x^*)^\top d + \tfrac{\alpha^2}{2} d^\top \nabla^2 f(x^*) d + o(\alpha^2). \end{aligned}$$
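One way to finish part 1 from here (a sketch): since $x^*$ is a local minimizer, $f(x^*+\alpha d)\geq f(x^*)$ for all sufficiently small $\alpha>0$ and every feasible direction $d$. For any $d$ with $\nabla f(x^*)^\top d = 0$ the first-order term drops out, so

$$\begin{aligned} 0 \leq f(x^*+\alpha d) - f(x^*) = \tfrac{\alpha^2}{2}\, d^\top \nabla^2 f(x^*)\, d + o(\alpha^2). \end{aligned}$$

Dividing by $\alpha^2$ and letting $\alpha \to 0^+$ gives $d^\top \nabla^2 f(x^*) d \geq 0$; taking $d = x - x^*$ (normalized) recovers the stated condition $(x-x^{*})^{T}Hf(x^{*})(x-x^{*})\geq0$.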