Infinity norm minimization


I am wondering how to minimize an objective function of the following form:

$$\min_{\mathbf{x}\in\mathbb{R}^{MN}} \|\mathbf{x}-\mathbf{y}\|_\infty + \lambda\,\mathrm{TV}(\mathbf{x})$$

Here, $\mathrm{TV}(\mathbf{x})$ is the total variation of the image, defined as $V(y)$ in the Wikipedia article on total variation denoising.

For example, could I take $\mathbf{z} = \mathbf{x} - \mathbf{y}$ and instead solve the constrained counterpart $$\min_{\mathbf{z}\in\mathbb{R}^{MN}} \mathrm{TV}(\mathbf{z}+\mathbf{y}) \text{ s.t. } \|\mathbf{z}\|_\infty \leq \beta,$$

which is equivalent to

$$\min_{\mathbf{z}\in\mathbb{R}^{MN}} \mathrm{TV}(\mathbf{z}+\mathbf{y}) \text{ s.t. } |z_i| - \beta \leq 0 \quad \forall\, i\,?$$

Could I then fold each constraint into a penalty term, such as a quadratic-logarithmic penalty function, to form a penalty aggregate and solve with the penalty method? More generally, can methods such as gradient descent, or the penalty aggregate method applied to the constrained version, be used to solve this problem?
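To make the penalty idea concrete, here is a minimal NumPy sketch of what I have in mind: the constraint $|z_i| \leq \beta$ is replaced by a quadratic penalty $\mu \sum_i \max(|z_i|-\beta, 0)^2$ with an increasing weight $\mu$, and the TV term is smoothed ($\sqrt{|\nabla u|^2 + \varepsilon}$) so that plain gradient descent applies. All function names and parameter choices below are my own illustration, not a standard implementation:

```python
import numpy as np

def forward_diff(u):
    # Forward differences with replicate (Neumann) boundary: last row/col -> 0.
    dx = np.zeros_like(u)
    dy = np.zeros_like(u)
    dx[:, :-1] = u[:, 1:] - u[:, :-1]
    dy[:-1, :] = u[1:, :] - u[:-1, :]
    return dx, dy

def divergence(px, py):
    # Negative adjoint of forward_diff: <forward_diff(u), p> = <u, -divergence(p)>.
    div = np.zeros_like(px)
    div[:, 0] = px[:, 0]
    div[:, 1:-1] = px[:, 1:-1] - px[:, :-2]
    div[:, -1] = -px[:, -2]
    div[0, :] += py[0, :]
    div[1:-1, :] += py[1:-1, :] - py[:-2, :]
    div[-1, :] += -py[-2, :]
    return div

def smooth_tv_grad(u, eps=1e-2):
    # Gradient of the smoothed TV: sum_i sqrt(dx_i^2 + dy_i^2 + eps).
    dx, dy = forward_diff(u)
    norm = np.sqrt(dx**2 + dy**2 + eps)
    return -divergence(dx / norm, dy / norm)

def penalty_method(y, beta, eps=1e-2, mu0=1.0, rho=4.0, outer=6, inner=300):
    # Quadratic penalty mu * sum(max(|z_i| - beta, 0)^2), mu increased each round.
    z = np.zeros_like(y)
    mu = mu0
    for _ in range(outer):
        # Step size from the Lipschitz bound 8/sqrt(eps) + 2*mu of the gradient.
        step = 1.0 / (8.0 / np.sqrt(eps) + 2.0 * mu)
        for _ in range(inner):
            g = smooth_tv_grad(z + y, eps)
            g += 2.0 * mu * np.sign(z) * np.maximum(np.abs(z) - beta, 0.0)
            z -= step * g
        mu *= rho
    return z
```

With this sketch, the returned $\mathbf{z}$ satisfies the box constraint only approximately (the penalty method enforces it exactly only in the limit $\mu \to \infty$), and the smoothing parameter $\varepsilon$ trades off fidelity to the true TV against the step sizes gradient descent can take.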

Thanks!

1 Answer


Both functions are proper, convex, and lower semicontinuous, and their proximal operators can be computed, so this problem can be solved with the primal-dual algorithm of Chambolle and Pock ("A first-order primal-dual algorithm for convex problems with applications to imaging", 2011).
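For instance, the proximal operator of the $\ell_\infty$ term follows from the Moreau decomposition: since the dual norm of $\|\cdot\|_\infty$ is $\|\cdot\|_1$, we have $\mathrm{prox}_{\tau\|\cdot\|_\infty}(\mathbf{v}) = \mathbf{v} - \Pi_{\tau B_1}(\mathbf{v})$, where $\Pi_{\tau B_1}$ is the Euclidean projection onto the $\ell_1$ ball of radius $\tau$. A sketch in NumPy (function names are mine; the projection uses the standard sort-and-threshold scheme):

```python
import numpy as np

def project_l1_ball(v, radius=1.0):
    # Euclidean projection onto {w : ||w||_1 <= radius} via sort-and-threshold.
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # magnitudes in decreasing order
    css = np.cumsum(u)
    # Largest index k with u[k] > (css[k] - radius) / (k + 1).
    k = np.nonzero(u * np.arange(1, len(u) + 1) > css - radius)[0][-1]
    theta = (css[k] - radius) / (k + 1)   # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, tau):
    # Moreau decomposition: prox of tau*||.||_inf equals v minus the
    # projection of v onto the l1 ball of radius tau (the dual-norm ball).
    return v - project_l1_ball(v, radius=tau)
```

The prox of the TV term is handled inside the primal-dual scheme itself: writing $\mathrm{TV}(\mathbf{x}) = \|K\mathbf{x}\|_1$ with $K$ the discrete gradient, the algorithm only needs the prox of the convex conjugate of $\lambda\|\cdot\|_1$, which is a simple componentwise (or per-pixel) projection onto a ball of radius $\lambda$.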