Proximal Operator of the Euclidean Norm ($ {L}_{2} $ Norm) without Using Moreau Decomposition


I'm trying to evaluate the following proximal operator (without using the Moreau decomposition or conjugates, just straight-up differentiation):

$\mathrm{prox}_{\lambda||\cdot||}(y) = \mathrm{argmin}_x\; \lambda||x||_2 + \tfrac{1}{2}||x-y||^2$

I took the gradient, assuming $x \neq 0$ (if $x = 0$, the norm is nondifferentiable).

Then, I got the following result:

$\frac{\lambda x}{||x||_2} + x - y = 0$

Trying to solve for $x$, the system appears inseparable, so we can't solve one coordinate at a time the way we can for the $L_1$ norm.

I then tried the following: $\frac{\lambda x_i}{||x||_2} = y_i-x_i \forall i$.

Squaring both sides so that the denominator looks nice:

$\frac{\lambda^2 x_i^2}{||x||^2_2} = (y_i-x_i)^2 \forall i$

Now we have:

$\frac{\lambda^2 x_i^2}{(y_i-x_i)^2} = \sum_j x_j^2 \quad \forall i$.

Although we have $n$ of these equations in $n$ unknowns, the fact that the system is nonlinear makes me think it can't be solved any further this way. However, a closed-form solution does exist without using the conjugate.

Any hints would be appreciated. Thanks!

There are 2 answers below.

BEST ANSWER

$\renewcommand{\Re}{\mathbb{R}}$ We need to solve the optimization problem
$$ \mathrm{prox}_{\lambda\|{}\cdot{}\|}(v) = \mathrm{argmin}_{x\in\Re^n}\|x\| + \tfrac{1}{2\lambda}\|x-v\|^2. $$
For convenience, and without loss of generality, let $\lambda=1$. Assume for now that $x\neq 0$. The optimality condition we need to solve is
\begin{align} \frac{x}{\|x\|} = v-x.\tag{1}\label{eq:1} \end{align}
From \eqref{eq:1} we see that $x$ is parallel to $v$ (again, provided that $x\neq 0$); indeed, $v=(1+1/\|x\|)x$. Let us assume that $x(v)$ has the parametric form $x(v) = \sigma(v)\cdot v$, where $\sigma:\Re^n\to\Re$. Substituting into \eqref{eq:1},
\begin{align} \frac{\sigma v}{|\sigma|\,\|v\|} = v-\sigma v,\tag{2} \end{align}
from which we conclude that $x(v) = \sigma(v)\cdot v$ may be either of the following candidates:
\begin{align} x(v) = \sigma v = \left(1 \pm \tfrac{1}{\|v\|} \right)v.\tag{3}\label{eq:3} \end{align}
We plug \eqref{eq:3} into \eqref{eq:1} and check whether it solves the optimality condition (recall that the solution is unique). We may verify that the following is indeed a solution:
\begin{align} x(v) = \left(1 - \tfrac{1}{\|v\|} \right)v,\tag{4}\label{eq:4} \end{align}
provided that $1 - \tfrac{1}{\|v\|}\geq 0$, that is, $\|v\|\geq 1$. In the remaining case $\|v\| < 1$, the subdifferential condition $0 \in \partial\|{\cdot}\|(0) - v$ holds (since $\|v\| \leq 1$), so the solution is $x = 0$.
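The closed form above can be checked numerically. A minimal sketch (the function name `prox_l2` is my own, not from the answer); it verifies the optimality condition $x/\|x\| = v - x$ for $\lambda = 1$:

```python
import numpy as np

def prox_l2(v, lam=1.0):
    """Prox of lam * ||.||_2 at v: returns (1 - lam/||v||) v
    when ||v|| >= lam, and 0 otherwise."""
    norm_v = np.linalg.norm(v)
    if norm_v <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm_v) * v

# Check the optimality condition x/||x|| = v - x for lam = 1, ||v|| > 1.
v = np.array([3.0, 4.0])          # ||v|| = 5
x = prox_l2(v)                    # (1 - 1/5) v = (2.4, 3.2)
residual = x / np.linalg.norm(x) - (v - x)
assert np.allclose(residual, 0.0)
```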


This is just my approach to solve the question, but I have accepted Pantelis's answer.

Starting from the beginning:

$prox_{\lambda||\cdot||}(y) = argmin_x ||x|| + \frac{1}{2\lambda}||x-y||^2$.

Setting the gradient to zero (assuming $x \neq 0$, and multiplying through by $\lambda$):

$\lambda\frac{x}{||x||} + (x-y) = 0$

$(\frac{\lambda}{||x||} + 1)x = y$

Take the norm of both sides:

$(\frac{\lambda}{||x||} + 1)||x|| = ||y||$

Treat $||x||$ as z:

$\lambda + z = ||y||$. $z = ||y|| - \lambda$.

We now know $||x|| = ||y|| - \lambda$ (valid when $||y|| > \lambda$; otherwise the minimizer is $x = 0$), so plug back into the equation:

$(\frac{\lambda}{||y|| - \lambda} + 1)x = y$, which simplifies to $x = \left(1 - \frac{\lambda}{||y||}\right)y$. This matches Pantelis's answer.
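As a sanity check on that final formula, here is a short numerical sketch (the names `objective` and `x_star` are mine, not from the answers): the closed form $x = (1 - \lambda/||y||)\,y$, with $x = 0$ when $||y|| \leq \lambda$, should attain a lower objective value than nearby points.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5
y = rng.normal(size=5)            # for this seed, ||y|| > lam

def objective(x):
    # The prox objective: ||x|| + (1 / (2 * lam)) * ||x - y||^2
    return np.linalg.norm(x) + np.linalg.norm(x - y) ** 2 / (2 * lam)

# Closed form from the derivation: x = (1 - lam/||y||) y when ||y|| > lam,
# and x = 0 otherwise.
x_star = max(0.0, 1.0 - lam / np.linalg.norm(y)) * y

# The candidate minimizer should beat random nearby points.
for _ in range(1000):
    x_near = x_star + 0.1 * rng.normal(size=5)
    assert objective(x_star) <= objective(x_near)
```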