Does convergence of iterates imply convergence of function values?


The question came to my mind when I was reading about the convergence of gradient descent. However, my question is general and does not necessarily stick to GD. Concretely, my question is: \begin{equation} \|x^k-x^*\|_2 \rightarrow 0 \implies f(x^k)-f(x^*) \rightarrow 0 \quad \text{and vice versa?} \end{equation} Assume whatever is necessary, e.g. convexity or Lipschitz continuity, as is typical for iterative algorithms. An example/counterexample would be great.


Best answer:

If $f$ is continuous, the implication $\|x^k-x^*\|_2 \rightarrow 0 \implies f(x^k)-f(x^*) \rightarrow 0$ holds; this is known as a sequential characterization of continuous functions.
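The forward direction can be checked numerically. A minimal sketch, assuming the particular choices $f(x) = \|x\|_2^2$ and iterates $x^k = x^* + d/k$ for a fixed direction $d$ (any continuous $f$ and any convergent iterate sequence would do):

```python
import numpy as np

# Forward direction: x^k -> x* and f continuous imply f(x^k) -> f(x*).
# Assumed example: f(x) = ||x||_2^2, iterates x^k = x* + d/k.
f = lambda x: np.dot(x, x)

x_star = np.array([1.0, -2.0])
d = np.array([3.0, 4.0])

gaps = []
for k in range(1, 1001):
    x_k = x_star + d / k          # ||x^k - x*||_2 = 5/k -> 0
    gaps.append(abs(f(x_k) - f(x_star)))

print(gaps[0], gaps[-1])          # the function-value gap shrinks with k
```

As expected, the gap $|f(x^k)-f(x^*)|$ decays to $0$ along with $\|x^k-x^*\|_2$.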

For the converse, additional hypotheses on $f$ are needed. Suppose $f$ is strictly convex and $x^*$ is its minimizer. Then it is true that $f(x^k)-f(x^*) \rightarrow 0 \implies \|x^k-x^*\|_2 \rightarrow 0$. To prove this, consider the quantity $$ m(r) = \min_{\|x-x^*\|=r} \big(f(x)-f(x^*)\big),\quad r>0. $$ The minimum is attained because the sphere is compact and $f$ is continuous (convex functions on $\mathbb{R}^n$ are continuous). Since $f$ is strictly convex, its minimizer is unique; therefore $m(r)>0$ for every $r>0$. Convexity also implies that $f(x)-f(x^*)\ge m(r)$ whenever $\|x-x^*\|\ge r$: the point $y$ where the segment from $x^*$ to $x$ crosses the sphere of radius $r$ satisfies $f(y)-f(x^*) \le \frac{r}{\|x-x^*\|}\big(f(x)-f(x^*)\big) \le f(x)-f(x^*)$, while $f(y)-f(x^*)\ge m(r)$. Taking the contrapositive: $$f(x)-f(x^*)< m(r) \implies \|x-x^*\| <r,$$ and you have the conclusion.
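Strict convexity really is needed here. A counterexample sketch, assuming the merely convex function $f(x)=\max(|x|-1,0)^2$ on $\mathbb{R}$, whose minimizers form the whole interval $[-1,1]$, with iterates $x^k=(-1)^k$ hopping between two distinct minimizers:

```python
# Converse fails without strict convexity: f(x) = max(|x|-1, 0)^2 is convex
# but its minimum is attained on all of [-1, 1]. The iterates x^k = (-1)^k
# give f(x^k) - f(x*) = 0 for every k, yet ||x^k - x*|| does not converge
# to 0 for x* = 1 (it oscillates between 0 and 2).
f = lambda x: max(abs(x) - 1.0, 0.0) ** 2

x_star = 1.0
xs = [(-1.0) ** k for k in range(1, 101)]

value_gaps = [f(x) - f(x_star) for x in xs]
iterate_gaps = [abs(x - x_star) for x in xs]

print(max(value_gaps))       # function-value gap is identically 0
print(iterate_gaps[-2:])     # iterate gap keeps oscillating: [2.0, 0.0]
```

So $f(x^k)-f(x^*)\to 0$ trivially, while $x^k$ has no limit at all; uniqueness of the minimizer (which strict convexity guarantees) is what rules this out.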