Gradient Descent with nonlinear constraint on Symmetric positive definite matrix space


I would like to find the stationary point $S_*$ (the global minimum) that minimizes the function $f(S)=\mathrm{trace}(S)+m^2\mathrm{trace}(S^{-2})$, which has been proven to be convex in Convexity of $\mathrm{trace}(S) + m^2\mathrm{trace}(S^{-2})$ for $S\in \mathcal{M}_{m,m}$ symmetric positive definite. I also have a nonlinear constraint in this optimization problem, $\|S-C\|_1\leq \epsilon$, where $\epsilon$ and $C\in \mathcal{M}_{m,m}$ are given.

Actually, I wasn't able to find a good optimization solver for this function. So I think this could be solved using the projected gradient descent algorithm, which would converge to the global minimum of $f$, by deriving the expression of the update rule: $$S_{k+1}=S_k-\alpha_k \nabla_Sf(S_k)$$
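For what it's worth, applying the standard matrix-calculus identities $\partial\,\mathrm{trace}(S)/\partial S = I$ and $\partial\,\mathrm{trace}(S^{-2})/\partial S = -2S^{-3}$ (the latter in the form valid for symmetric $S$), the gradient would be

$$\nabla_S f(S) = I - 2m^2 S^{-3},$$

so the plain (unprojected) update would read $S_{k+1} = S_k - \alpha_k\left(I - 2m^2 S_k^{-3}\right)$. Setting the gradient to zero gives the unconstrained stationary point $S_*^3 = 2m^2 I$, i.e. $S_* = (2m^2)^{1/3} I$, which is useful as a sanity check.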

This would lead to an update rule $S_{k+1}\leftarrow h(S_k)$, where $h$ is a function of the iterate $S_k$ at step $k$. So I would like to find the expression of $h(S_k)$ for the projected gradient descent algorithm, taking into account both the projection onto the symmetric positive definite cone and the nonlinear constraint set.
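To make the idea concrete, here is a minimal numerical sketch of projected gradient descent for this problem, under two stated assumptions: (1) $\|\cdot\|_1$ is taken as the entrywise $\ell_1$ norm, and (2) the $\ell_1$ constraint is handled by a simple radial shrink toward $C$ rather than an exact projection onto the intersection of the two constraint sets (an exact treatment would need something like Dykstra's alternating projections). The SPD projection is the usual symmetrize-and-clip-eigenvalues step. All function names (`grad_f`, `project_spd`, `projected_gradient_descent`) are mine, not from any library:

```python
import numpy as np

def grad_f(S, m):
    # Gradient of f(S) = trace(S) + m^2 trace(S^{-2}) for symmetric S:
    #   grad f(S) = I - 2 m^2 S^{-3}
    S_inv = np.linalg.inv(S)
    return np.eye(S.shape[0]) - 2 * m**2 * np.linalg.matrix_power(S_inv, 3)

def project_spd(S, floor=1e-8):
    # Nearest symmetric matrix, then clip eigenvalues away from zero
    # so the iterate stays positive definite.
    S = (S + S.T) / 2
    w, V = np.linalg.eigh(S)
    w = np.clip(w, floor, None)
    return V @ np.diag(w) @ V.T

def projected_gradient_descent(C, m, epsilon, alpha=1e-2, iters=500):
    S = project_spd(C.copy())
    for _ in range(iters):
        # Gradient step, then project back onto the SPD cone.
        S = project_spd(S - alpha * grad_f(S, m))
        # Heuristic handling of ||S - C||_1 <= epsilon (entrywise l1,
        # an assumption): shrink the step radially toward C if violated.
        D = S - C
        n1 = np.abs(D).sum()
        if n1 > epsilon:
            S = project_spd(C + D * (epsilon / n1))
    return S
```

For example, with `C = 3 * np.eye(2)`, `m = 2`, `epsilon = 1.0`, the iterate moves toward the unconstrained minimizer $2I$ until the $\ell_1$ ball stops it at diagonal entries $2.5$. Note the shrink step is only a heuristic: it can slightly interact with the SPD re-projection, so this is a sketch of the structure of $h(S_k)$, not a certified solver.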