I've been stuck on a minimization problem for a while now. It's related to another question of mine (Ignoring positive (semi)definite condition for optimization), but here I'm asking about much more general directions.
I want to minimize $f:S_n^+ \to \mathbb{R}$ where $S_n^+$ is the positive semidefinite cone, so it's a constrained optimization problem. Currently, I'm using the Cholesky decomposition to reformulate this as an unconstrained optimization problem, as shown in my last comment in https://math.stackexchange.com/a/3827434/799328. I'm pretty sure this isn't really the state of the art, but I'm having trouble pinning down well-established methods for this setting. Here, $f$ is smooth (it's $C^\infty$) with uniformly bounded derivatives, but it's not convex. Any help would be greatly appreciated. Thanks!
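For concreteness, the factorization idea can be sketched as follows. This is a minimal toy example, not my actual $f$: the quadratic objective below is a made-up stand-in. Writing $A = LL^T$ for a square factor $L$ (the Cholesky factor would be lower triangular, but any square factor gives the same reformulation) makes $A$ positive semidefinite automatically, so the problem in $L$ is unconstrained:

```python
import numpy as np

n = 3
rng = np.random.default_rng(0)

# Toy stand-in for the real smooth f: f(A) = ||A - B||_F^2 for a fixed
# symmetric B (B need not be PSD, so the constraint can be active).
B = rng.standard_normal((n, n))
B = (B + B.T) / 2

def f(A):
    return np.sum((A - B) ** 2)

def grad_f(A):
    return 2 * (A - B)

# Parameterize A = L L^T.  Then g(L) = f(L L^T) is unconstrained, and by
# the chain rule, grad g(L) = (grad f(A) + grad f(A)^T) L.
def grad_g(L):
    G = grad_f(L @ L.T)
    return (G + G.T) @ L

# Plain gradient descent on the unconstrained variable L.
L = rng.standard_normal((n, n))
t = 0.005
for _ in range(5000):
    L -= t * grad_g(L)

A_opt = L @ L.T  # positive semidefinite by construction
```

Note that even when $f$ is convex in $A$, the reformulated $g(L) = f(LL^T)$ is generally nonconvex in $L$, which is one reason I suspect this isn't the best approach.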
It looks like you want to minimize $f(A)$ subject to the constraint that $A$ is positive semidefinite, and you have the ability to compute the gradient of $f$ (and we note that $\nabla f(A)$ is a matrix, with the same shape as $A$).
One simple approach would be to use the projected gradient method: $$ A^{k+1} = P(A^k - t \nabla f(A^k)) $$ for $k = 0, 1, \ldots$. Here $t > 0$ is a step size (e.g., fixed at $1/L$ when $\nabla f$ is $L$-Lipschitz, or chosen by backtracking line search), and $P$ is the function that projects a given matrix onto the positive semidefinite cone. (If we work in the space $S_n$ of $n \times n$ symmetric matrices, then $\nabla f(A^k)$ will be symmetric, so $A^k - t \nabla f(A^k)$ will also be symmetric. Projecting a symmetric matrix onto the positive semidefinite cone is a standard operation: compute an eigendecomposition and set the negative eigenvalues equal to $0$.)
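A minimal sketch of this iteration (the quadratic $f$ below is just a stand-in for your actual objective; only a gradient oracle is assumed):

```python
import numpy as np

def proj_psd(M):
    """Project a symmetric matrix onto the PSD cone:
    eigendecompose and clip negative eigenvalues at zero."""
    w, V = np.linalg.eigh(M)
    return (V * np.maximum(w, 0)) @ V.T

def projected_gradient(grad_f, A0, t, iters=500):
    """Projected gradient iteration: A <- P(A - t * grad_f(A))."""
    A = A0
    for _ in range(iters):
        A = proj_psd(A - t * grad_f(A))
    return A

# Toy example: f(A) = ||A - B||_F^2, whose gradient is 2(A - B);
# the minimizer over the PSD cone is the projection of B onto it.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
B = (B + B.T) / 2
A_star = projected_gradient(lambda A: 2 * (A - B), np.zeros((4, 4)), t=0.01)
```

Since your $f$ is nonconvex, this only gives convergence to a stationary point, but that is the best one can guarantee for most first-order methods in this setting anyway.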
You could also use an accelerated projected gradient method such as FISTA, which might converge much faster with only a couple extra lines of code.
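In this constrained setting, FISTA amounts to taking the same projected gradient step at an extrapolated point that carries momentum. A sketch, again with a toy quadratic standing in for $f$ and an assumed step size $t \le 1/L$:

```python
import numpy as np

def proj_psd(M):
    """Project a symmetric matrix onto the PSD cone by clipping
    negative eigenvalues at zero."""
    w, V = np.linalg.eigh(M)
    return (V * np.maximum(w, 0)) @ V.T

def fista(grad_f, A0, t, iters=1000):
    """Accelerated projected gradient (FISTA): the projected gradient
    step is evaluated at an extrapolated point Y with Nesterov momentum."""
    A, Y, s = A0, A0.copy(), 1.0
    for _ in range(iters):
        A_next = proj_psd(Y - t * grad_f(Y))
        s_next = (1 + np.sqrt(1 + 4 * s * s)) / 2
        Y = A_next + ((s - 1) / s_next) * (A_next - A)
        A, s = A_next, s_next
    return A

# Same toy objective as before: f(A) = ||A - B||_F^2, grad f(A) = 2(A - B).
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
B = (B + B.T) / 2
A_star = fista(lambda A: 2 * (A - B), np.zeros((4, 4)), t=0.01)
```

The standard FISTA convergence guarantees are for convex problems; since your $f$ is nonconvex, the acceleration is a heuristic here, though it often still helps in practice.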