Consider a function $f:\mathbb{R}^n\to\mathbb{R}$ that is proper, convex, bounded from below, and has a nonempty minimum set $\mathcal{S}$ (we do not assume $\mathcal{S}$ is bounded). I am curious about the following questions:
- if a sequence $\{x_k\}$ satisfies $\lim_{k\to\infty}f(x_k) = f^*$, where $f^*$ is the minimum value of $f$, is it true that $\lim_{k\to\infty} d(x_k,\mathcal{S}) = 0$, or only that $\liminf_{k\to\infty} d(x_k,\mathcal{S}) = 0$? If neither holds, is there a counterexample?
- what about the converse: if $d(x_k,\mathcal{S})\to 0$, does it follow that $f(x_k)\to f^*$?
- if we additionally assume that $f$ is differentiable, what is the relation between convergence of the function values, convergence of the distance to the minimum set, and convergence of the gradient norm $\|\nabla f(x_k)\|$ to zero?
For the first question, Rockafellar's *Convex Analysis* provides a relevant theorem, but it assumes that the function is proper, convex, closed, and has no direction of recession. The "directions of recession of a function $f$" are the nonzero vectors $y\in \mathbb{R}^n$ such that $f(x+\lambda y)$ is a nonincreasing function of $\lambda$ for every choice of $x$.
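To make the recession condition concrete, here is a quick worked example (my own illustration, not taken from the book). The convex function

$$f(x_1,x_2) = \sqrt{x_1^2+x_2^2} - x_1$$

has $y=(1,0)$ as a direction of recession, since for every $(x_1,x_2)$

$$\frac{\partial}{\partial\lambda}\, f(x_1+\lambda,\, x_2) \;=\; \frac{x_1+\lambda}{\sqrt{(x_1+\lambda)^2+x_2^2}} - 1 \;\le\; 0,$$

so $f(x_1+\lambda, x_2)$ is nonincreasing in $\lambda$. Hence the theorem above does not apply to this $f$, even though it is proper, convex, closed, and bounded below.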
The answers to questions 1 and 2 are both NO. For counterexamples, see Bauschke and Combettes, *Convex Analysis and Monotone Operator Theory in Hilbert Spaces*, second edition.
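Independently of the examples in that book, here is a simple finite-valued counterexample for question 1 (my own construction, chosen so it is easy to check numerically): take $f(x_1,x_2)=\sqrt{x_1^2+x_2^2}-x_1$, which is convex with $f^*=0$ and minimum set $\mathcal{S}=\{(x_1,0): x_1\ge 0\}$. Along the sequence $x_k=(k,\,k^{1/4})$ one gets $f(x_k)\approx k^{1/2}/(2k)\to 0$ while $d(x_k,\mathcal{S}) = k^{1/4}\to\infty$. A quick numerical check:

```python
import math

def f(x1, x2):
    # f(x) = ||x|| - x1: convex (a norm minus a linear function),
    # bounded below by 0, minimum set S = {(t, 0) : t >= 0}
    return math.hypot(x1, x2) - x1

def dist_to_S(x1, x2):
    # Euclidean distance from (x1, x2) to S = {(t, 0) : t >= 0}
    if x1 >= 0:
        return abs(x2)          # projection is (x1, 0)
    return math.hypot(x1, x2)   # projection is the origin

# Along x_k = (k, k**0.25): f(x_k) -> 0 but d(x_k, S) -> infinity
for k in [1e2, 1e4, 1e6, 1e8]:
    xk = (k, k ** 0.25)
    print(f"k={k:.0e}  f(x_k)={f(*xk):.2e}  d(x_k,S)={dist_to_S(*xk):.1f}")
```

The printed values show $f(x_k)$ decreasing toward $0$ while $d(x_k,\mathcal{S})$ grows without bound, so $f(x_k)\to f^*$ forces neither $\lim$ nor even $\liminf$ of $d(x_k,\mathcal{S})$ to be $0$. Note that $(1,0)$ is a direction of recession of this $f$, which is exactly why the recession-free theorem does not rule this out.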