Is There an Example Where the Proximal / Prox Method Diverges?


This is not a homework question, but something I would like to clear up.

I have the following proximal interpretation: a solution of the problem is a fixed point of the mapping $$ x^{\ast} \in \arg \min_{x \in X_{\text{adm}}} \left\{ \frac{1}{2}\|x-x^{\ast}\|^{2} + \gamma f(x) \right\}, \quad \gamma > 0$$

with $$X_{\text{adm}} = \{\ x \in \mathbb{R}^n \mid x \geq 0,\ A_{\text{eq}}x=b_{\text{eq}},\ A_{\text{ineq}}x \leq b_{\text{ineq}} \}$$ These $A$'s can be interpreted as constraints, and $X_{\text{adm}}$ is a compact and convex subset of $\mathbb{R}^n$.

Here I assume that $f$ is convex and differentiable with a Lipschitz-continuous gradient.

Does this proximal method always converge, or can it diverge?

I suspect that it can diverge, but maybe someone can hit me with an example?

I want to understand this because I'm going to study the Proximal Gradient method. Thanks for your help and time.

1 Answer

The proximal minimization algorithm (iterating the mapping you wanted to describe) is the application to optimization of the so-called proximal point algorithm for finding zeroes of monotone operators.
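To see why a solution is exactly a fixed point of that mapping, one can write the first-order optimality condition of the inner minimization at the point $x = x^{\ast}$, using the normal cone $N_{X_{\text{adm}}}$ of the admissible set (a standard argument, sketched here for intuition):

$$0 \in (x^{\ast} - x^{\ast}) + \gamma\,\partial f(x^{\ast}) + N_{X_{\text{adm}}}(x^{\ast}) \iff 0 \in \partial f(x^{\ast}) + N_{X_{\text{adm}}}(x^{\ast}),$$

so, since $\gamma > 0$, $x^{\ast}$ is a fixed point of the mapping precisely when it solves $\min_{x \in X_{\text{adm}}} f(x)$.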

Its convergence to a solution is ensured under basically no assumptions on the (nonzero) stepsize $\gamma$, as stated in Theorem 4 of Rockafellar, “Monotone operators and the proximal point algorithm”, 1976.

So there is no counterexample showing divergence; in fact, you don't even need smoothness of $f$.
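To illustrate the claim numerically, here is a minimal sketch (my own toy example, not from the answer): the proximal point iteration applied to a convex quadratic over the nonnegative orthant, with a deliberately large stepsize $\gamma$. The problem data, the inner projected-gradient solver, and all tolerances are assumptions made for this demo.

```python
import numpy as np

# Toy problem (assumed for illustration): minimize the convex quadratic
#   f(x) = 1/2 x^T Q x - b^T x   over the admissible set {x >= 0}.
# Each outer proximal step solves
#   min_x 1/2 ||x - x_k||^2 + gamma * f(x)   s.t. x >= 0,
# approximately, by projected gradient descent on the inner objective.

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 3))
Q = M.T @ M + 0.1 * np.eye(3)   # positive definite, so f is strongly convex
b = rng.standard_normal(3)

def inner_prox(xk, gamma, n_steps=5000):
    """Approximate prox_{gamma f} at xk over the nonnegative orthant."""
    x = np.maximum(xk, 0.0)
    L = 1.0 + gamma * np.linalg.norm(Q, 2)   # Lipschitz constant of inner gradient
    for _ in range(n_steps):
        grad = (x - xk) + gamma * (Q @ x - b)
        x = np.maximum(x - grad / L, 0.0)    # gradient step + projection onto x >= 0
    return x

# Outer proximal iteration with a large stepsize gamma — it still converges:
x = np.ones(3)
for _ in range(50):
    x = inner_prox(x, gamma=10.0)

# The limit should satisfy the KKT conditions of min f(x) s.t. x >= 0:
# feasibility, grad f(x) >= 0 componentwise, and complementarity x * grad f(x) = 0.
g = Q @ x - b
print(np.all(x >= -1e-8), np.all(g >= -1e-6), abs(x @ g) < 1e-6)
# prints: True True True
```

Increasing `gamma` only speeds up the outer convergence here, consistent with the theorem: the stepsize need not be small, it just has to stay bounded away from zero.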