How to use second-order Gateaux derivative for convex optimization?


Consider a functional $f:V \to \mathbb{R}$ where $V$ is some vector space.

The first-order Gateaux derivative of $f$ at a point $v \in V$ in the direction $u \in V$ is defined as follows:

\begin{align} \Delta_u f(v) = \frac{d}{d \epsilon} f( (1-\epsilon) v +\epsilon u) \Big |_{\epsilon =0} . \end{align}

Note that this definition is slightly different from the conventional one but is nonetheless useful. For example, if $f$ is concave, then $v$ is a global maximum if and only if

\begin{align} \Delta_u f(v) \le 0 \quad \forall u \in V. \end{align}
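As a numerical sanity check of this condition (a sketch, not from the question itself: `gateaux` is a hypothetical central-difference helper, and $f(x) = -\|x\|^2$ on $\mathbb{R}^2$ is a concrete concave functional whose global maximum is the origin):

```python
import numpy as np

def gateaux(f, v, u, eps=1e-6):
    # Central-difference approximation of d/de f((1-e)v + e*u) at e = 0
    g = lambda e: f((1 - e) * v + e * u)
    return (g(eps) - g(-eps)) / (2 * eps)

f = lambda x: -np.dot(x, x)          # concave, global maximum at the origin
v_max = np.zeros(2)
rng = np.random.default_rng(0)
directions = rng.standard_normal((100, 2))
# At the maximizer, Delta_u f(v) <= 0 for every direction u (here it is exactly 0)
print(all(gateaux(f, v_max, u) <= 1e-8 for u in directions))  # True
```

At a non-maximizer such as $v = (1, 0)$ the same helper returns positive values for some directions (e.g. $u = 0$, which moves toward the origin), so the condition correctly fails there.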

This optimality condition can be found, for example, in Luenberger's *Optimization by Vector Space Methods*. Similarly, we can define the second-order Gateaux derivative as

\begin{align} \Delta_u^2 f(v) = \frac{d^2}{d \epsilon^2} f( (1-\epsilon) v +\epsilon u) \Big |_{\epsilon =0} . \end{align}
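One immediate observation (illustrated below with a hedged sketch; `gateaux2` is a hypothetical helper, and the analytic value in the comment follows from expanding $g(\epsilon) = f((1-\epsilon)v + \epsilon u)$): since $\epsilon \mapsto f((1-\epsilon)v + \epsilon u)$ is concave whenever $f$ is concave, $\Delta_u^2 f(v) \le 0$ holds at every $v$, not only at a maximizer.

```python
import numpy as np

def gateaux2(f, v, u, eps=1e-4):
    # Central second difference of g(e) = f((1-e)v + e*u) at e = 0
    g = lambda e: f((1 - e) * v + e * u)
    return (g(eps) - 2 * g(0) + g(-eps)) / eps**2

f = lambda x: -np.dot(x, x)      # concave
v = np.array([1.0, 0.0])         # an arbitrary point, not the maximizer
u = np.array([0.0, 1.0])
# Analytically g(e) = -||v + e(u - v)||^2, so g''(0) = -2||u - v||^2 = -4
print(gateaux2(f, v, u))         # approximately -4.0
```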

My questions: Suppose that $f$ is concave. Are there theorems involving $\Delta_u^2 f(v)$ that characterize the global maximum of $f$? For example, is there an analogue of the second-derivative test? More generally, how can $\Delta_u^2 f(v)$ be used in convex optimization? Any references would also be appreciated.