Gradient of Moreau-Yosida Regularization


Let $f:\Re^n\rightarrow \Re$ be a proper, closed, convex function. Its Moreau-Yosida regularization $F$ and its proximal operator $\operatorname{Prox}_f$ are defined as

$F(x)=\min_y\Big(f(y)+\frac{1}{2}\|y-x\|_2^2\Big)$

$\operatorname{Prox}_f(x)=\arg\min_y \Big(f(y)+\frac{1}{2}\|y-x\|_2^2\Big)$

Lots of literature says $F(x)$ is Lipschitz continuous and gives an explicit expression for $\nabla F(x)$ involving $\operatorname{Prox}_f(x)$. But I have no idea how to calculate $\nabla F(x)$. Can anyone provide a straightforward method? I know Rockafellar's book gives a proof, but it assumes too much prior knowledge. I am wondering if there is a more elementary method to prove the Lipschitz continuity and calculate the gradient.
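For concreteness, the expression the literature gives is $\nabla F(x)=x-\operatorname{Prox}_f(x)$. Here is a minimal numerical sanity check of that formula (not a proof), using the scalar case $f(y)=|y|$, for which the prox is the soft-threshold map and $F$ is the Huber function:

```python
import math

def f(y):          # f(y) = |y|
    return abs(y)

def prox(x):       # soft threshold: argmin_y |y| + (1/2)(y - x)^2
    return math.copysign(max(abs(x) - 1.0, 0.0), x)

def F(x):          # Moreau envelope, evaluated at the minimizer prox(x)
    p = prox(x)
    return f(p) + 0.5 * (p - x) ** 2

# Claimed identity: F'(x) = x - prox(x); compare with a central difference.
for x in [-2.0, -0.3, 0.7, 1.5]:
    h = 1e-6
    fd = (F(x + h) - F(x - h)) / (2 * h)
    assert abs(fd - (x - prox(x))) < 1e-5
```

Note that $F'$ here equals $x$ on $[-1,1]$ and $\operatorname{sign}(x)$ outside, so it is $1$-Lipschitz even though $f$ itself is not differentiable at $0$.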

1 Answer

You will find the proof of the Lipschitz continuity of $F$ here.

You will not find $\nabla F$ in a straightforward way, i.e., without solving some nonlinear equations. Consider $\nabla f$ as a map from $\mathbb R^n$ to $\mathbb R^n$, and let $\psi$ be its inverse map. Then $\nabla F$ is the inverse of $\psi+\mathrm{id}$. Indeed, $\psi$ is the gradient of the Legendre transform of $f$, which we can call $g$. The Legendre transform converts the infimal convolution of $f$ and $\frac12\|x\|^2$ into the sum of $g$ and $\frac12\|y\|^2$. Taking the transform again to return to the $x$-variable amounts, at the level of gradients, to taking inverses.
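A quick worked check of this recipe, under the assumption that $f$ is smooth and strongly convex so $\nabla f$ is invertible: take $f(x)=\frac{a}{2}x^2$ with $a>0$. Then $\nabla f(x)=ax$, so $\psi(y)=y/a$, and one can verify numerically that $\nabla F$ (computed directly from the prox) is indeed the inverse of $\psi+\mathrm{id}$:

```python
# Worked example with f(x) = (a/2) x^2, a > 0 (chosen for illustration).
a = 3.0

def prox(x):            # minimize (a/2) y^2 + (1/2)(y - x)^2  =>  y = x / (1 + a)
    return x / (1.0 + a)

def grad_F(x):          # direct formula: grad F(x) = x - prox(x) = a x / (1 + a)
    return x - prox(x)

def psi_plus_id(y):     # psi(y) + y, where psi = (grad f)^{-1}, i.e. psi(y) = y / a
    return y / a + y

# The recipe says (psi + id) composed with grad F is the identity map.
for x in [-1.0, 0.5, 4.0]:
    assert abs(psi_plus_id(grad_F(x)) - x) < 1e-12
```

In this case everything is explicit: $F(x)=\frac{a}{2(1+a)}x^2$ and $\nabla F(x)=\frac{a}{1+a}x$, which is exactly $(\psi+\mathrm{id})^{-1}$.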