Half quadratic splitting (alternating optimization with penalty)


Hello, I'm working on the following optimization problem:

$$\hat{x}=\text{arg min}_{x} \frac{1}{2}\parallel y - Hx \parallel^{2} + \lambda \Phi (x), \quad x, y \in \mathbb{R}^{N\times M}.$$

where $H$ is a matrix and $\Phi$ is a regularization function.

To solve this problem, my idea is to split it into two subproblems, as in ADMM (http://stanford.edu/~boyd/papers/pdf/admm_distr_stats.pdf).

Minimizing the objective above is equivalent to solving the constrained problem

$$\hat{x}=\text{arg min} _{x} \frac{1}{2}\parallel y - Hx \parallel^{2} + \lambda \Phi (z)\quad \text{s.t.}\quad z=x, \qquad z, x \in \mathbb{R}^{N\times M}.$$

The half quadratic splitting (HQS) method instead minimizes the following penalized cost function:

$$L_{\mu}(x,z) = \frac{1}{2}\parallel y-Hx\parallel^{2} +\lambda \Phi(z) + \frac{\mu}{2} \parallel z-x\parallel^{2}, \qquad z, x \in \mathbb{R}^{N\times M}, \mu\in\mathbb{R^{+}} .$$

The HQS method minimizes it by alternating between the two variables:

\begin{equation} \left\{ \begin{aligned} x_{k+1} &= \text{arg min} _{x} {\frac{1}{2}}\parallel {y-}Hx \parallel^{2} + \frac{\mu}{2} \parallel x-z_{k}\parallel^{2}, \\[1pt] z_{k+1} &= \text{arg min} _{z} \frac{\mu}{2} \parallel z-x_{k+1}\parallel^{2} + \lambda \Phi (z). \end{aligned} \right. \end{equation}
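For concreteness, here is a minimal sketch of these two updates for one specific (assumed) choice of regularizer, $\Phi(z) = \parallel z \parallel_1$ with vector-valued $x, z$: the $x$-update is then a linear solve with $H^{\top}H + \mu I$, and the $z$-update is the proximal operator of $\frac{\lambda}{\mu}\parallel\cdot\parallel_1$, i.e. soft thresholding. The function name and parameter defaults are hypothetical, not from any reference implementation.

```python
import numpy as np

def hqs_l1(y, H, lam=1.0, mu=10.0, iters=500):
    """HQS for 0.5*||y - H x||^2 + lam*||x||_1 (illustrative sketch).

    Assumes Phi is the l1 norm, so the z-update is soft thresholding.
    mu is held fixed here; in practice it is often increased over the
    iterations so that z - x -> 0.
    """
    n = H.shape[1]
    z = np.zeros(n)
    A = H.T @ H + mu * np.eye(n)   # x-update normal matrix (fixed while mu is fixed)
    Hty = H.T @ y
    for _ in range(iters):
        # x-update: quadratic subproblem, solved exactly
        x = np.linalg.solve(A, Hty + mu * z)
        # z-update: prox of (lam/mu)*||.||_1, i.e. soft thresholding
        z = np.sign(x) * np.maximum(np.abs(x) - lam / mu, 0.0)
    return x, z
```

Note that for a fixed $\mu$ the iterates solve the penalized problem $L_{\mu}$, not the constrained one, which is exactly why the question about the required assumptions on $\mu$ matters.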

My question is: is this equivalent to the original problem? What do we have to suppose about $\mu$ for convergence and for obtaining a solution of the original problem?

And most importantly, what convergence results exist? Is there a textbook where I can learn about this?