Frobenius norm minimization subject to diagonal constraints


Edit: I have now reformulated the problem in a more readable way.

I have a not-too-complicated minimization problem that I need a bit of help with.

I am given three positive semidefinite matrices: $R, \Gamma, S \in \mathbb{C}^{N \times N}$.

I am trying to minimize the following objective function over $\boldsymbol{d}, \boldsymbol{a}$ and $\beta$: \begin{equation*} \begin{aligned} & \underset{\boldsymbol{d},\boldsymbol{a},\beta}{\text{minimize }} && \|\text{diag}(\boldsymbol{d})^{\frac{1}{2}}~R~\text{diag}(\boldsymbol{d})^{\frac{1}{2}} + \text{diag}(\boldsymbol{a})^{\frac{1}{2}}~\Gamma~\text{diag}(\boldsymbol{a})^{\frac{1}{2}} - \beta S\|_F^2 \\ & \text{s. t. } && \boldsymbol{d}>\boldsymbol{0}, \quad \boldsymbol{a}>\boldsymbol{0}, \quad \beta>0 \end{aligned} \end{equation*} where $\boldsymbol{d}, \boldsymbol{a} \in \mathbb{R}^N$ and $\beta \in \mathbb{R}$ (the constraint $\beta > 0$ forces $\beta$ to be real and positive).
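For concreteness, here is how the objective can be evaluated numerically (a minimal sketch; the function name `objective` and the use of NumPy are my own choices, not part of the problem statement). Note that $\text{diag}(\boldsymbol{d})^{1/2}\,R\,\text{diag}(\boldsymbol{d})^{1/2}$ is just the elementwise product of $R$ with $\sqrt{\boldsymbol{d}}\sqrt{\boldsymbol{d}}^T$, which the code below exploits:

```python
import numpy as np

def objective(d, a, beta, R, Gamma, S):
    """Evaluate ||D^{1/2} R D^{1/2} + A^{1/2} Gamma A^{1/2} - beta*S||_F^2,
    where D = diag(d), A = diag(a). Works for complex R, Gamma, S."""
    sd = np.sqrt(d)   # diag(d)^{1/2} as a vector
    sa = np.sqrt(a)   # diag(a)^{1/2} as a vector
    # diag(d)^{1/2} R diag(d)^{1/2} == (sd sd^T) * R  (elementwise)
    M = np.outer(sd, sd) * R + np.outer(sa, sa) * Gamma - beta * S
    return np.linalg.norm(M, 'fro') ** 2
```

With $\Gamma = 0$, $\boldsymbol{d} = \boldsymbol{1}$, $\beta = 1$ and $S = R$, the objective is exactly zero, which makes a convenient sanity check.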

This problem is definitely non-convex. My first idea was to obtain solutions in an alternating manner (fixing some of the variables and minimizing over the rest), but I am not sure how to impose the positivity constraints, so now I am unsure how to approach the problem.
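One common way to handle the strict-positivity constraints is to reparameterize $\boldsymbol{d} = e^{\boldsymbol{x}}$, $\boldsymbol{a} = e^{\boldsymbol{y}}$, $\beta = e^{t}$ (elementwise) and run an unconstrained local solver on $(\boldsymbol{x}, \boldsymbol{y}, t)$. The sketch below is only an illustration of that substitution, not a claimed solution to the non-convex problem: it uses `scipy.optimize.minimize` with BFGS, finds a local minimum only, and the helper name `solve_reparam` is my own:

```python
import numpy as np
from scipy.optimize import minimize

def solve_reparam(R, Gamma, S, z0=None):
    """Illustrative sketch: enforce d, a > 0 and beta > 0 via the substitution
    d = exp(x), a = exp(y), beta = exp(t), then minimize without constraints.
    The objective is non-convex, so this finds a local minimum only;
    random restarts over z0 may be needed in practice."""
    N = R.shape[0]

    def unpack(z):
        return np.exp(z[:N]), np.exp(z[N:2 * N]), np.exp(z[2 * N])

    def f(z):
        d, a, beta = unpack(z)
        sd, sa = np.sqrt(d), np.sqrt(a)
        M = np.outer(sd, sd) * R + np.outer(sa, sa) * Gamma - beta * S
        return np.linalg.norm(M, 'fro') ** 2   # real-valued even for complex M

    z0 = np.zeros(2 * N + 1) if z0 is None else z0   # start at d = a = 1, beta = 1
    res = minimize(f, z0, method='BFGS')
    d, a, beta = unpack(res.x)
    return d, a, beta, res.fun
```

The exponential map guarantees feasibility at every iterate; the price is that the solver can only approach, never attain, boundary points where some entry of $\boldsymbol{d}$ or $\boldsymbol{a}$ would be zero.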

Any help will be appreciated.