Nonsmooth Convex Optimization


I want to solve an optimization problem of the form $$\min_{x}\; f(x) + g(x),$$ where $f(x)$ is $\mu$-strongly convex and differentiable with an $L$-Lipschitz continuous gradient, while $g(x)$ is convex, continuous, and $M$-Lipschitz, but not differentiable. I am using a subgradient update of the form $$ x_{k+1}=x_k-\alpha_k(\nabla f(x_k) + z_k), $$ where $z_k\in \partial g(x_k)$ and $\left\|z_k\right\|\leq M$ for all $k$. In my setting, I cannot compute the proximal operator of $g$. What convergence guarantees and convergence rates can be established for this scheme?
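
For concreteness, here is a minimal runnable sketch of the iteration on a toy instance; the quadratic $f$, the $\ell_1$ term $g$, and the step size $\alpha_k = 1/(\mu(k+1))$ are purely illustrative choices, not my actual problem:

```python
import numpy as np

# Toy instance (illustrative only):
#   f(x) = 0.5 * x^T A x - b^T x  with A positive definite
#     -> mu = lambda_min(A), L = lambda_max(A), grad f(x) = A x - b
#   g(x) = lam * ||x||_1
#     -> convex, Lipschitz with M = lam * sqrt(d); lam * sign(x) is a subgradient
rng = np.random.default_rng(0)
d = 20
Q = rng.standard_normal((d, d))
A = Q.T @ Q + np.eye(d)            # positive definite, so f is strongly convex
b = rng.standard_normal(d)
lam = 0.1

mu = np.linalg.eigvalsh(A).min()   # strong convexity constant of f

def grad_f(x):
    return A @ x - b

def subgrad_g(x):
    # one element of the subdifferential of lam * ||x||_1
    return lam * np.sign(x)

x = np.zeros(d)
for k in range(5000):
    alpha = 1.0 / (mu * (k + 1))   # diminishing step size (one common choice)
    x = x - alpha * (grad_f(x) + subgrad_g(x))
```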

If you have an upper estimate $\xi \in \mathbb{R}$ for the minimal value of $f+g$, you could use a subgradient projection algorithm (see, e.g., Section 29.6 of Bauschke & Combettes' 2017 book). Suppose $C = \{x\ | \ f(x) + g(x) \leq \xi\} \neq \varnothing$, and let $s(x)$ denote any selection of $\partial(f+g)$, i.e., $s(x) \in \partial(f+g)(x)$ for every $x$. For instance, you could iterate $$ x_{n+1} = \begin{cases} x_n + \dfrac{\xi - (f(x_n) + g(x_n))}{\|s(x_n)\|^2}\,s(x_n) &\mbox{if } f(x_n)+g(x_n) > \xi,\\ x_n &\mbox{otherwise.} \end{cases} $$

Under your hypotheses, this algorithm is guaranteed to converge weakly to a point of $C$; a sketch of the iteration is given below.
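
Here is a rough sketch of that iteration in code; the placeholder objective (a quadratic plus an $\ell_1$ norm), the level $\xi$, and the selection $s$ are all stand-ins chosen only so the loop runs:

```python
import numpy as np

# Placeholder problem: F(x) = f(x) + g(x) with a known upper bound xi on min F.
# Here f(x) = 0.5*||x - c||^2 (strongly convex, smooth) and g(x) = ||x||_1.
rng = np.random.default_rng(1)
d = 10
c = rng.standard_normal(d)

def F(x):
    return 0.5 * np.dot(x - c, x - c) + np.abs(x).sum()

def s(x):
    # one selection of the subdifferential of f + g
    return (x - c) + np.sign(x)

# Upper estimate of min F: evaluate F at the soft-thresholded point and pad it.
x_star = np.sign(c) * np.maximum(np.abs(c) - 1.0, 0.0)
xi = F(x_star) + 1e-3

x = np.zeros(d)
for _ in range(1000):
    Fx = F(x)
    if Fx > xi:
        g_x = s(x)
        # subgradient projection step toward the sublevel set C = {F <= xi}
        x = x + (xi - Fx) / np.dot(g_x, g_x) * g_x
    else:
        break  # x already lies in C
```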

BTW, this setup would be perfect for Douglas-Rachford splitting if you had a reasonable way to approximate $\operatorname{prox}_g$; Douglas-Rachford also comes with nice convergence guarantees.
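
For reference, one standard form of the Douglas-Rachford iteration with a fixed step size $\gamma > 0$ is $$ y_n = \operatorname{prox}_{\gamma f}(u_n), \qquad u_{n+1} = u_n + \operatorname{prox}_{\gamma g}(2y_n - u_n) - y_n, $$ where, under standard assumptions, $y_n$ converges weakly to a minimizer of $f+g$.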