Smoothness and the eigenvalues of the Hessian


A continuously differentiable function $f$ is $\beta$-smooth if the gradient is $\beta$-Lipschitz: $$ || \nabla f(x) - \nabla f(y)|| \leq \beta ||x-y||, $$ where $\nabla f(\cdot)$ denotes the gradient vector of $f$ at a given point, $\beta \geq 0$ is a scalar, and $||\cdot||$ is the $\ell_2$-norm.
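As a numerical sketch of this definition (not part of the question), consider the assumed example $f(x) = \frac{1}{2}x^\top A x$ with symmetric $A$: its gradient is $Ax$, and the smallest valid $\beta$ is the spectral norm of $A$, i.e. the largest eigenvalue magnitude.

```python
# Numerical sketch: for f(x) = 1/2 x^T A x with symmetric A, the gradient
# is Ax, and the best Lipschitz constant beta is the spectral norm of A
# (the largest |eigenvalue| of the Hessian A).
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])                       # symmetric: Hessian of f everywhere
beta = np.max(np.abs(np.linalg.eigvalsh(A)))     # spectral norm of A

def grad(x):
    return A @ x                                 # gradient of f(x) = 1/2 x^T A x

# The Lipschitz ratio ||grad(x) - grad(y)|| / ||x - y|| never exceeds beta.
ratios = []
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    ratios.append(np.linalg.norm(grad(x) - grad(y)) / np.linalg.norm(x - y))

print(max(ratios) <= beta + 1e-12)               # True
```

Here the ratio equals $||A(x-y)||/||x-y||$, which is bounded by the spectral norm of $A$ by definition, so the sampled maximum stays below $\beta$.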

In the book Convex Optimization: Algorithms and Complexity by Sébastien Bubeck, it is said that:

The above Lipschitz condition implies (if the function is twice differentiable) the eigenvalues of the Hessians being smaller than $\beta$.

I cannot see how.

Best answer

Consider $y=x+v$ with $v \in \mathbb{R}^n$ being any non-zero vector. The Lipschitz condition then reads $$ \frac{||\nabla f(x+v)-\nabla f(x)||_2}{||v||_2} \leq \beta. $$ By the first-order Taylor expansion of the gradient, $$ \nabla f(x+v)=\nabla f(x)+\nabla^2f(x)v+r(v), $$ where the remainder satisfies $\lim_{v \to 0} \frac{||r(v)||_2}{||v||_2}=0$. Rearranging, taking norms, dividing by $||v||_2$ and applying the triangle inequality gives $$ \frac{||\nabla^2f(x)v||_2}{||v||_2} \leq\frac{||\nabla f(x+v)-\nabla f(x)||_2}{||v||_2}+ \frac{||r(v)||_2}{||v||_2}\leq \beta+\frac{||r(v)||_2}{||v||_2} $$
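The vanishing of the remainder can be checked numerically. Below is a sketch on an assumed example, the softplus $f(x) = \sum_i \log(1 + e^{x_i})$, whose gradient is the sigmoid applied coordinate-wise and whose Hessian is diagonal:

```python
# Numerical check of the remainder term r(v) on an assumed example:
# softplus f(x) = sum_i log(1 + exp(x_i)), with gradient sigmoid(x) and
# Hessian diag(sigmoid(x) * (1 - sigmoid(x))).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(x):
    return sigmoid(x)                        # gradient of softplus

def hessian(x):
    s = sigmoid(x)
    return np.diag(s * (1.0 - s))            # Hessian of softplus (diagonal)

x = np.array([0.3, -1.2, 0.7])
v = np.array([1.0, 2.0, -0.5])

# r(tv) = grad(x + tv) - grad(x) - hessian(x) @ (tv); the ratio
# ||r(tv)|| / ||tv|| shrinks as t -> 0 (it is O(t) here).
for t in [1.0, 0.1, 0.01]:
    r = grad(x + t * v) - grad(x) - hessian(x) @ (t * v)
    print(t, np.linalg.norm(r) / np.linalg.norm(t * v))
```

The printed ratios decrease roughly linearly in $t$, consistent with $||r(v)||_2/||v||_2 \to 0$.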
Note that the Hessian is symmetric, so it has real eigenvalues with corresponding eigenvectors. Suppose, for contradiction, that there exists a $v \in \mathbb{R}^n$ such that $\nabla^2f(x) v=\lambda v $ with $|\lambda| >\beta$. Plugging $tv$ for $t\in \mathbb R \setminus \{ 0\}$ into the above inequality, and using $||\nabla^2f(x)(tv)||_2 = |\lambda| \, ||tv||_2$, we obtain $$ |\lambda| \leq \beta+\frac{||r(tv)||_2}{||tv||_2} $$
Sending $t \to 0$ gives a contradiction, so every eigenvalue satisfies $|\lambda| \leq \beta$.
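The conclusion can also be illustrated numerically on the same assumed softplus example, $f(x) = \sum_i \log(1 + e^{x_i})$: the sigmoid gradient is $\tfrac14$-Lipschitz, and every Hessian eigenvalue $s(1-s)$ lies in $(0, \tfrac14]$, matching $\beta = \tfrac14$.

```python
# Numerical illustration of the conclusion on the assumed softplus example:
# both the max Hessian eigenvalue and the max sampled Lipschitz ratio of
# the gradient stay below beta = 1/4.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
beta = 0.25

max_eig = 0.0
max_ratio = 0.0
for _ in range(1000):
    x, y = rng.normal(size=3), rng.normal(size=3)
    s = sigmoid(x)
    max_eig = max(max_eig, np.max(s * (1.0 - s)))     # Hessian is diagonal
    max_ratio = max(max_ratio,
                    np.linalg.norm(sigmoid(x) - sigmoid(y))
                    / np.linalg.norm(x - y))

print(max_eig <= beta, max_ratio <= beta)             # True True
```

Both quantities approach but never exceed $\tfrac14$, which is attained in the limit as the points concentrate near the origin.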