Euclidean Projection onto the $L_{2,1}$ Norm


I follow the notation on page 24 of Chambolle and Pock (2011).

In that document, the convex set $P$ is $$ P=\{p\in Y \mid \|p\|_\infty \leq 1\} $$

where $\|p\|_\infty = \max_{i,j}|p_{i,j}|$ and $|p_{i,j}| = \sqrt{(p^1_{i,j})^2 + (p^2_{i,j})^2}$, with $p = (p^1, p^2)$ collecting the two components of the dual (gradient) field at each pixel. The indicator function $\delta_P(p)$ is $0$ if $p\in P$ and $+\infty$ if $p\notin P$.

The paper says that the proximal operator of $\delta_P$ reduces to pointwise Euclidean projections onto $L^2$ balls: $$p_{i,j}^{k+1} = \frac{p_{i,j}^k}{\max(1,|p_{i,j}^k|)} $$
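For concreteness, here is how I read that update in code (a minimal NumPy sketch of my own; the `(H, W, 2)` array layout and the name `project_onto_P` are my assumptions, not the paper's):

```python
import numpy as np

def project_onto_P(p):
    """Pointwise projection of a dual field p onto the set P.

    p has shape (H, W, 2): p[i, j] holds (p^1_{i,j}, p^2_{i,j}).
    """
    # |p_{i,j}| = sqrt((p^1_{i,j})^2 + (p^2_{i,j})^2) at each pixel
    mag = np.sqrt(np.sum(p**2, axis=-1))
    # p_{i,j} <- p_{i,j} / max(1, |p_{i,j}|)
    return p / np.maximum(1.0, mag)[..., None]
```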


My question is: why $L^2$ balls, and not $L^\infty$ balls? I am very confused about this.

1 Answer

Every $p_{ij}$ is a vector in $\mathbb{R}^2$ that contains the discrete gradient at pixel $(i,j)$. The projection operation (the proximal operator of $\delta_P$) decomposes pixel by pixel, because

$$p^{k+1} = \arg \min_{p \in P} \Vert p^k - p \Vert^2 = \arg \min_{\Vert p_{ij} \Vert_2 \le 1 \;\forall (i,j)} \sum_{ij} \Vert p_{ij}^k - p_{ij} \Vert^2,$$ since the constraint $\|p\|_\infty \le 1$ says precisely that every single $p_{ij}$ lies in the unit $\ell_2$ ball, and the objective is a sum of independent per-pixel terms. Minimizing each term separately, we get

$$ p_{ij}^{k+1} = \arg \min_{\Vert p_{ij} \Vert_2 \le 1} \Vert p_{ij}^k - p_{ij} \Vert^2, $$

i.e. at iteration $k$ we project each input point $p_{ij}^k$ onto the unit $\ell_2$ ball, and the closed form of that projection is exactly the update you posted in the question.
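To see this numerically, here is a small check (my own sketch, reusing the `project_onto_P` function from the question above): pixels already inside the unit $\ell_2$ ball are left untouched, and pixels outside are rescaled onto the boundary along their own direction, which is exactly what the Euclidean projection does.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 2.0 * rng.normal(size=(4, 5, 2))   # random dual field; some pixels fall outside the ball

proj = project_onto_P(p)               # pointwise update from the question
mag_in = np.linalg.norm(p, axis=-1)
mag_out = np.linalg.norm(proj, axis=-1)

# Feasibility: after projection every pixel is in the unit l2 ball,
# hence ||proj||_inf = max_{ij} |proj_{ij}| <= 1.
assert mag_out.max() <= 1.0 + 1e-12

inside = mag_in <= 1.0
# Points already in the ball are fixed points of the projection ...
assert np.allclose(proj[inside], p[inside])
# ... and points outside land exactly on the boundary.
assert np.allclose(mag_out[~inside], 1.0)
```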