Is Proximal Gradient Method (PGM) suitable for solving matrix optimization problems?


Generally, the well-known Compressed Sensing (CS) task can be modeled by the following vector optimization problem with a pre-defined convex regularizer $\mathcal{R}(\cdot)$:

$$ \underset{\mathbf{\hat{x}}}{\min}\frac{1}{2}\lVert \mathbf{A\hat{x}}-\mathbf{y} \rVert _{2}^{2}+\lambda \mathcal{R}\left( \mathbf{\hat{x}} \right) , $$

where $\mathbf{\hat{x}}\in\mathbb{R}^{N}$ is the target signal, $\mathbf{A}\in\mathbb{R}^{M\times N} (M\ll N)$ is the measurement matrix, $\mathbf{y}\in\mathbb{R}^{M}$ is the measurement, and $\lambda \in \mathbb{R^+}$ is the regularization parameter.

The above optimization problem can be solved iteratively by the Proximal Gradient Method (PGM), which consists of the following two steps:

$$ \mathbf{z}^{(k)}=\mathbf{\hat{x}}^{(k)}-\rho^{(k)}\mathbf{A}^\top(\mathbf{A}\mathbf{\hat{x}}^{(k)}-\mathbf{y}),\\ \mathbf{\hat{x}}^{(k+1)} =\arg\min_{\mathbf{\hat{x}}}\frac{1}{2}\lVert \mathbf{\hat{x}}-\mathbf{z}^{\left( k \right)} \rVert _{2}^{2}+\lambda \mathcal{R}\left( \mathbf{\hat{x}} \right), $$

where $k$ denotes the PGM iteration index and $\rho^{(k)}$ is the step size. The first step is a standard gradient descent step on the data-fidelity term, while the second step is the so-called proximal mapping, generally denoted by $\mathbf{prox}_{\lambda \mathcal{R}}(\cdot)$.
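To make the two steps concrete, here is a minimal NumPy sketch of the iteration, assuming $\mathcal{R}(\cdot)=\lVert\cdot\rVert_1$ so that the proximal mapping reduces to elementwise soft-thresholding (the function names `pgm` and `soft_threshold` are my own, and the constant step size $\rho = 1/\lVert\mathbf{A}\rVert_2^2$ is one common choice):

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1, applied elementwise.
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def pgm(A, y, lam, num_iters=300):
    # Constant step size 1/L, where L = ||A||_2^2 is the Lipschitz
    # constant of the gradient of the data-fidelity term.
    rho = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        z = x - rho * A.T @ (A @ x - y)   # gradient descent step
        x = soft_threshold(z, lam * rho)  # proximal mapping
    return x
```

With this step size the objective is non-increasing along the iterates, which is easy to check numerically on a small random instance.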


My question is: can PGM be directly extended to the following matrix optimization problem:

$$ \underset{\mathbf{\hat{X}}}{\min}\frac{1}{2}\lVert \mathbf{A\hat{X}}-\mathbf{Y} \rVert _{F}^{2}+\lambda \mathcal{R}\left( \mathbf{\hat{X}} \right) , $$

where $\mathbf{\hat{X}}\in\mathbb{R}^{N\times C}$ is the matrix of target signals and $\mathbf{Y}\in\mathbb{R}^{M\times C}$ is the matrix of measurements?

  1. Is this obvious? If not, how can one give a simple but complete proof?

  2. For this matrix optimization problem, does there exist a better solution?
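For what it is worth, one observation I can verify numerically: since $\frac{1}{2}\lVert \mathbf{A\hat{X}}-\mathbf{Y} \rVert_F^2 = \sum_{c=1}^{C} \frac{1}{2}\lVert \mathbf{A}\hat{\mathbf{x}}_c-\mathbf{y}_c \rVert_2^2$, if $\mathcal{R}$ is also separable across columns (e.g. the entrywise $\ell_1$ norm), the matrix problem decouples into $C$ independent vector problems, and the matrix PGM iteration is exactly $C$ vector PGMs run in parallel. A small sketch checking this (again assuming the $\ell_1$ regularizer; `pgm_matrix` is my own name):

```python
import numpy as np

def soft_threshold(Z, tau):
    # Entrywise prox of tau * ||.||_1; works for vectors and matrices alike.
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def pgm_matrix(A, Y, lam, num_iters=200):
    # Same two PGM steps, with the Frobenius norm replacing the l2 norm.
    rho = 1.0 / np.linalg.norm(A, 2) ** 2
    X = np.zeros((A.shape[1], Y.shape[1]))
    for _ in range(num_iters):
        Z = X - rho * A.T @ (A @ X - Y)   # gradient step on the F-norm term
        X = soft_threshold(Z, lam * rho)  # entrywise proximal mapping
    return X
```

Running `pgm_matrix` on $\mathbf{Y}$ and on each column $\mathbf{y}_c$ separately produces matching columns, which is the column-wise decoupling in action.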


I am a beginner in convex optimization theory and look forward to your answers or tips. Thank you very much! :)